CN105814532A - Approaches for three-dimensional object display - Google Patents

Approaches for three-dimensional object display

Info

Publication number
CN105814532A
CN105814532A (Application No. CN201480051260.XA)
Authority
CN
China
Prior art keywords
device
user
computing device
webpage
interface element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201480051260.XA
Other languages
Chinese (zh)
Inventor
Charley Ames (查理·埃姆斯)
Dennis Pilarinos (丹尼斯·皮拉里诺斯)
Peter Frank Hill (彼得·弗兰克·希尔)
Sasha Mikhael Perez (莎莎·迈克尔·佩雷斯)
Timothy Thomas Gray (蒂莫西·托马斯·格雷)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Amazon Technologies Inc
Original Assignee
Amazon Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/029,736 external-priority patent/US10592064B2/en
Priority claimed from US14/029,747 external-priority patent/US20150082145A1/en
Priority claimed from US14/029,756 external-priority patent/US10067634B2/en
Application filed by Amazon Technologies Inc filed Critical Amazon Technologies Inc
Publication of CN105814532A publication Critical patent/CN105814532A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/161Indexing scheme relating to constructional details of the monitor
    • G06F2200/1614Image rotation following screen orientation, e.g. switching from landscape to portrait mode
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Position Input By Displaying (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Approaches enable three-dimensional (3D) display of, and interaction with, interfaces (such as a webpage, an application, etc.) when a device is operating in a 3D view mode. For example, interface elements can be highlighted, emphasized, animated, or otherwise altered in appearance and/or arrangement in the renderings of those interfaces, based at least in part on an orientation of the device or a position of a user of the device. Further, the 3D view mode can provide an animated 3D departure and appearance of elements as the device navigates from a current page to a new page. Further still, approaches provide the ability to specify 3D attributes (such as appearance, actions, etc.) of the interface elements. In this way, a developer of such interfaces can use information (e.g., tags, CSS, JavaScript, etc.) to specify a 3D appearance change to be applied to at least one interface element when the 3D view mode is activated.

Description

Approaches for three-dimensional object display
Background
People are increasingly interacting with computers and other electronic devices to perform a wide variety of tasks. In order to help users navigate and perform these tasks in a more intuitive and user-friendly way, interfaces are increasingly providing new views and types of interactivity. One such approach includes animating, highlighting, or otherwise emphasizing aspects of content that are likely to be of interest to a user. While such approaches can enhance a user's experience when interacting with such content, in many situations the content is not organized or presented in an intuitive way. For example, due to the way content is presented on the display screen of a computing device, a user may have difficulty locating desired content. Further, due to the small form factor of portable computing devices, content that might otherwise be presented on a webpage, in a document, in an application, and/or in a list often cannot be displayed to the user in an intuitive and user-friendly way.
Brief Description of the Drawings
Various embodiments in accordance with the present disclosure will be described with reference to the accompanying drawings, in which:
Fig. 1 illustrates an example situation in which a user can view content on, and interact with, a computing device, in accordance with various embodiments;
Fig. 2 illustrates an example interface state that can be presented in accordance with an embodiment;
Figs. 3(a) and 3(b) illustrate various interface states that can be presented in accordance with various embodiments;
Figs. 4(a) and 4(b) illustrate various interface states that can be presented in accordance with various alternate embodiments;
Figs. 5(a) and 5(b) illustrate various interface states that can be presented in accordance with various alternate embodiments;
Figs. 6(a), 6(b), 6(c), and 6(d) illustrate various interface states that can be presented in accordance with various alternate embodiments;
Figs. 7(a), 7(b), and 7(c) illustrate various interface states that can be presented in accordance with various alternate embodiments;
Figs. 8(a) and 8(b) illustrate various interface states that can be presented in accordance with various alternate embodiments;
Figs. 9(a), 9(b), and 9(c) illustrate various interface states that can be presented in accordance with various alternate embodiments;
Fig. 10 illustrates an interface state that can be presented in accordance with various alternate embodiments;
Fig. 11 illustrates an example process for presenting various interfaces in accordance with various embodiments;
Figs. 12(a), 12(b), 12(c), and 12(d) illustrate various interface states that can be presented in accordance with various alternate embodiments;
Fig. 13 illustrates an example process for presenting various interfaces in accordance with various embodiments;
Fig. 14 illustrates an example process for presenting various interfaces in accordance with various embodiments;
Figs. 15(a) and 15(b) illustrate example approaches to determining the relative position of a user that can be utilized in accordance with various embodiments;
Figs. 16(a) and 16(b) illustrate example approaches to determining device motion that can be utilized in accordance with various embodiments;
Fig. 17 illustrates an example device that can be used to implement aspects of the various embodiments;
Fig. 18 illustrates example components of a client device, such as the client device illustrated in Fig. 17; and
Fig. 19 illustrates an environment in which the various embodiments can be implemented.
Detailed Description
Systems and methods in accordance with various embodiments of the present disclosure can overcome one or more of the aforementioned and other deficiencies experienced in conventional approaches to enabling users to interact with an electronic device. In particular, various approaches enable three-dimensional (3D) display of, and interaction with, interfaces (e.g., webpages, content pages, applications) when the device is operating in a 3D view mode. For example, various embodiments enable interface elements to be highlighted, emphasized, animated, or otherwise altered in appearance and/or arrangement when rendering those interfaces. This can include causing elements to appear to be brought "forward," toward the front of the display screen, in a 3D display or in a quasi-three-dimensional rendering on a two-dimensional (2D) display. For example, interface elements can appear to be positioned and/or displayed in 3D space such that certain elements (e.g., text, images) become larger in 3D depth and/or appear closer to the surface of the device's display screen, while other elements (e.g., advertisements) "fall back" or appear smaller in 3D depth. As the user tilts, rotates, or otherwise changes the orientation of the device, or performs a gesture (e.g., shaking the device) or touch input, the interface elements can move back and forth or otherwise change shape or appearance. In some situations, the amount of 3D depth and the appearance can be based on a user profile, which can be used to provide a more personalized view of the content, or on a document profile, which can be used to provide a default view of the content. In various embodiments, the appearance of the interface is changed based on the relative position of the user with respect to the device, which can be determined by capturing images using at least one camera of the computing device and analyzing those images to determine the relative position of the user's head or eyes with respect to the device. Changes in the orientation and/or position of the computing device can also be determined using at least one motion sensor of the device, in order to provide a higher sampling frequency than might otherwise be possible using the image information captured by the camera, or to otherwise attempt to improve the relative position determination. In some situations, a sensor that is remote from, separate from, or otherwise in communication with the device can be used to detect the orientation and/or position of the device. Orientation information from the sensor can be received at the device, and the device can cause the appearance of the interface to change based at least in part on the received orientation and/or position information. In some situations, the orientation and/or position information received from the sensor can be used to activate the 3D view mode. Accordingly, a user can view and interact with the interface elements of the interface, and can maneuver through the various interface elements using the various approaches discussed and suggested herein.
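As a rough illustration of the camera-based rendering described above, the parallax applied to an element could be computed from the tracked head position and the element's assigned depth. This is a minimal sketch under assumed names and units; nothing here is taken from the patent's actual implementation.

```javascript
// Minimal parallax sketch (all names/units are assumptions): given the
// user's head offset from the screen center and an estimated viewing
// distance (both from camera-based head tracking), shift an element by
// an amount proportional to its assigned z-depth. Elements with larger
// depth shift more, producing a quasi-3D effect on a standard 2D display.
function parallaxOffset(headX, headY, headDistance, depth) {
  const scale = depth / headDistance;
  return { dx: -headX * scale, dy: -headY * scale };
}
```

In practice the head pose would come from face detection on camera frames, optionally fused with gyroscope/accelerometer samples for a higher update rate, as the text describes.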
In some embodiments, upon receiving a request to navigate from a first content page (e.g., a webpage) to a second content page, the device can cause at least a subset of the interface elements to be animated on the display screen to show a 3D departure from the 3D view of the first content page. For example, a content page (e.g., a webpage) can include one or more interface elements, such as images, a headline, article text, and so on. As described, when the device operates in the 3D view mode, the interface elements can be highlighted, emphasized, animated, or otherwise altered in appearance and/or arrangement. In accordance with various embodiments, the 3D view mode can be activated in response to a number of different activation events. For example, the 3D view mode can be activated in response to detecting a change in the orientation and/or position of the device (e.g., the device being tilted, rotated, or shaken). Additionally or alternatively, the 3D view mode can be activated by touch input, voice input, and/or gesture input, among other input types. In some situations, the 3D view mode can be activated when the user runs a particular application. In various embodiments, the user can select when, and for which applications, to operate in the 3D view mode. For example, the user can control whether the 3D view mode is always active, is inactive, or is active for particular intervals. The user can also select which applications (if any) operate in the 3D view mode. Further, as described, the 3D view mode can be activated by a remote sensor. For example, when the device comes within a predetermined or detectable range of the sensor, the 3D view mode can be activated.
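The activation logic sketched below is one hypothetical way to combine those triggers with the user's preferences; the event types and the shape of the settings object are invented for illustration.

```javascript
// Hypothetical 3D-view-mode activation check. `settings.mode` models the
// user choices described above (always on, never, or triggered), and
// `allowedApps`, when present, restricts the mode to chosen applications.
function shouldActivate3D(event, settings) {
  if (settings.mode === "never") return false;
  if (settings.allowedApps && !settings.allowedApps.includes(event.app)) return false;
  if (settings.mode === "always") return true;
  // "triggered": respond to orientation changes, gestures, touch/voice
  // input, or proximity to a separate sensor in communication with the device.
  const triggers = ["tilt", "rotate", "shake", "touch", "voice", "sensor-proximity"];
  return triggers.includes(event.type);
}
```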
In accordance with various embodiments, as the device navigates from a current page to a new page, the 3D view mode can also provide an animated 3D departure and appearance of elements. The animation can include "dropping" the first page in 3D space, where at least a subset of the interface elements can be displayed to travel away from the user at different rates as the interface elements fall. Accordingly, the interface elements can be displayed to rotate, spin, or otherwise animate at different rates and in a number of different ways as they exit the display screen. The animation can also show the 3D appearance, on the display screen, of one or more interface elements of the second content page, replacing the interface elements of the first page. Such animations can advantageously be used to mask latency while a page loads, and can be used to modify or otherwise enhance the appearance of a page. It should be noted that various other animations are possible in accordance with the embodiments described herein.
Various embodiments provide the ability to specify 3D attributes (such as appearance, actions, etc.) of interface elements. For example, a developer of such a page can use information (e.g., tags, CSS, JavaScript) to specify a 3D appearance change to be applied to at least one interface element when the 3D view mode is activated. In various embodiments, when the developer does not specify a 3D appearance change, the interface can determine the 3D appearance change to apply to at least one element of the interface. For example, the interface can utilize a Document Object Model (DOM) hierarchy, or another model hierarchy that includes such information, to determine how to display or otherwise alter the appearance of interface elements in 3D space (such as by bringing an element "forward" toward the front of the display screen, or causing an element to "fall back" from the front of the display screen).
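A hypothetical sketch of the two paths just described: honor a developer-supplied hint when present, and otherwise derive a depth from the element's position in the document hierarchy. The property name and per-level spacing are assumptions, not the patent's scheme.

```javascript
// Resolve an element's 3D depth. A developer hint (imagined here as a
// plain property, standing in for a data-* attribute or CSS value set
// via tags/CSS/JavaScript) wins; otherwise fall back to the element's
// nesting depth in the tree, a stand-in for a DOM-hierarchy heuristic.
function resolve3DDepth(el, pxPerLevel = 4) {
  if (el.hint !== undefined) return el.hint; // developer-specified change
  let depth = 0;
  for (let node = el; node.parent; node = node.parent) depth++;
  return depth * pxPerLevel; // deeper-nested content sits closer to the viewer
}
```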
Various other applications, processes, and uses are presented below with respect to the various embodiments.
Fig. 1 illustrates an example situation 100 in which a user 102 is interacting with a computing device 104. Although a portable computing device (e.g., a smartphone, an e-book reader, or a tablet computer) is shown, it should be understood that various other types of electronic devices that are capable of determining and processing input can be used in accordance with the various embodiments discussed herein. These devices can include, for example, notebook computers, personal digital assistants, video game consoles or controllers, portable media players, and wearable computers (e.g., smart watches, smart glasses), among others. In this example, the computing device 104 includes a camera 106 operable to perform functions such as image and/or video capture. The device can also include other image capture elements, such as at least one additional camera, a charge-coupled device (CCD), a motion detection sensor, or an infrared sensor, among other elements. In this example, the user 102 is located at a relative position with respect to the device, such that the user's point of view follows a determinable trajectory 110 between the user's head or eyes and the device. As discussed herein, the device can use information such as information indicative of the device's orientation and information indicative of the position of the user's head or eyes to render an interface on the display screen or another such component of the computing device, so as to enable 3D display of, and interaction with, the interface (e.g., a webpage) when the device operates in the 3D view mode. In accordance with various embodiments, when operating in the 3D view mode, the display screen provides the appearance of 3D, or 3D-like behavior, but may be a standard 2D display. Accordingly, various embodiments enable interface elements (e.g., images, text, advertisements) to be highlighted when rendering those interfaces, such as by bringing elements "forward" toward the front of the display screen in a 3D display or a quasi-three-dimensional rendering on a two-dimensional (2D) display, based on the orientation of the computing device and the position of the user's head or eyes. As the user's point of view relative to the computing device changes, due to movement of the user and/or the computing device and changes in the device's orientation, the rendering can be updated. Further, the rendering can utilize 3D depth information (such as a set of layer depths or z-levels) to determine how the various interface elements relate to one another.
For example, Fig. 2 illustrates an example interface on the display screen 202 of a computing device 200. In this example, the user is viewing a conventional 2D representation of a webpage 201. As with many conventional webpages, the webpage can be divided into zones or regions, depending on the type of content, the markup of the webpage, a visual classification of the webpage, and/or a whitespace analysis of the webpage. In this example, the webpage includes a headline 204, article text 206, at least one image 208, at least one link 210, an advertisement 214, and various other links 216. It should be understood, however, that aspects of the various embodiments can be used with a variety of types of interfaces, which can include a number of different interface elements with which a user can interact. In this example, the user can select an interface element by touching the region of the display screen 202 associated with that element. Approaches for enabling a user to interact with a touch-sensitive display screen using a finger or another such object are well known in the art and will not be discussed in detail herein. Various other selection approaches can be used as well, such as moving a cursor, performing a gesture, or speaking a voice command, among others.
As mentioned, when the device operates in the 3D view mode, the device alters the rendering of the interface elements, such as to show a 3D representation of those elements. In various embodiments, the user can enter such a mode by changing the orientation of the device (such as by tilting, rotating, translating, or flicking the device, or otherwise adjusting the device's orientation relative to the user). Various other approaches can be used to cause the device to enter the 3D view mode as well, such as voice commands, touch input, gesture input, and other detectable inputs. Thereafter, one or more interface elements can be caused to be positioned and/or displayed at different heights in 3D space. In some embodiments, the height of each interface element can be related to what the user of the device is most likely to be interested in. In this way, the height of an interface element can visually indicate relevant or important information. For example, the most important portion of an article can be raised so that it is largest and easiest to read. Less important content can recede in 3D space; for example, advertisements can appear recessed into the display screen. In some situations, advertisements can compete for height on a webpage, where more expensive or more relevant advertisements can be displayed higher in space than other advertisements or interface elements.
As shown in Fig. 3(a), upon detecting activation of the 3D view mode, the interface 301 displayed on the display screen 302 of the computing device 300 can be rendered to have at least two (and in many situations more) different "levels" or z-depths, where the upper level of some interface elements is rendered to appear near the outer surface of the display screen, while the upper level of other interface elements can be rendered to appear positioned lower in the interface (e.g., separated a distance from the outer surface of the display screen). As described, the interface elements can include a headline 304, article text 306, at least one image 308, at least one link 310, an advertisement 314, and various other links 316, among others.
In various embodiments, the interface elements rendered to appear positioned at the upper level can be those elements intended to be more important or relevant to the user. These elements can include related links, article images, and the like. As will be described later herein, important or relevant elements and/or content can be determined based on information about the user, such as can be included in one or more user profiles and/or content profiles. In this example, the link elements 316 are rendered to appear positioned at the upper level. Such a rendering can take a variety of forms, such as slightly enlarging an element, generating and/or adjusting a rendered shading of the element, increasing the sharpness or focus level of the element, or adjusting the color or tone of the element. Various other elements, such as the image element 308, can be rendered to appear positioned at the upper level as well. As shown in this example, other elements determined to be less important interface elements of the interface can be rendered to appear positioned at a lower level of the interface. For example, the advertisement 314 can be rendered to appear smaller (e.g., shorter) or recessed relative to the interface elements determined to be important to the user, and may have a shading, blur, or lower color intensity to cause the element to appear to be positioned further from the display screen, and so on. In various embodiments, the interface can be at least somewhat animated, such that some of the more important or less important elements can slowly adjust in appearance, causing the interface elements to appear to move "forward" or "backward" in the interface as the orientation of the device and/or the relative position of the user changes.
In accordance with various embodiments, the importance of interface elements can be based at least in part on a user profile, a page profile, a combination of a user profile and a page profile, or other information. For example, the user profile can include information indicative of the user's viewing preferences, browsing history, and/or other personal information associated with the user. The user can have a number of profiles, such as an operating system profile, profiles associated with one or more applications, an online social networking profile, and other profiles, and information included in any of these profiles can be used to determine important or relevant content. For example, the user profile can be used to determine the user's interest in sports, health, finance, news, and the like, which can then be used to determine the relative size (e.g., extrusion from the interface), position, arrangement, and/or appearance of certain interface elements with respect to other interface elements. For example, if it is determined that a particular link on the interface links to relevant content, that link can appear to move further forward (e.g., appear taller) than other interface elements. In another example, if a particular article is determined to be relevant, the text of that article can be positioned and/or displayed in 3D space such that the text is the primary focus of the interface, and accordingly more readable due to its size and position relative to the other elements. In this way, an advertisement, image, link, article text, or any interface element determined to be relevant to the user can be positioned "forward" or "backward" based on the determined relevance.
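One way to picture the profile-driven layout is a simple scoring pass; the topic-overlap metric and the level cap below are invented for illustration and are not the patent's actual method.

```javascript
// Score an element's relevance as the overlap between the topics in a
// user profile and the topics attributed to the element, then clamp the
// score to a small number of z-levels so relevant elements extrude
// further from the interface than less relevant ones.
function relevanceScore(profileInterests, elementTopics) {
  return elementTopics.filter(t => profileInterests.includes(t)).length;
}
function zLevelFor(score, maxLevel = 3) {
  return Math.min(score, maxLevel);
}
```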
In various other embodiments, a page profile can include information indicative of the importance of the various interface elements, and this information can be used to determine the 3D appearance (e.g., height, position) of the interface elements in 3D space as the orientation of the device and/or the relative position of the user with respect to the device changes. For example, the appearance of the interface elements can be predetermined by the developer of the interface, the provider of the device, or some other entity. For example, the developer or other entity can specify different weights for the elements, where a weight is an indication of the element's relevance, and the weight can be used to adjust at least one of the height, color, or appearance of one or more of the elements. Accordingly, various embodiments enable interface elements (e.g., images, text, advertisements) to be highlighted based at least in part on the weights of the elements when rendering those interfaces, such as by bringing elements "forward" toward the front of the display screen in a 3D display or a quasi-three-dimensional rendering on a two-dimensional (2D) display, based on the orientation of the computing device and the position of the user's head or eyes. In various situations, the weight can be based on the number of clicks received, the amount of interaction between the user and the element, and so on. In this way, the interface elements can change dynamically, where the height of any interface element can be based at least in part on the interest in that particular interface element. For example, an interface element that receives more clicks than other interface elements can appear taller than those other elements.
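The click-driven weighting could be modeled as below; the update rule and the pixels-per-unit mapping are assumptions for illustration only.

```javascript
// Each interaction nudges an element's weight upward, and the rendered
// height is derived from the weight, so elements that receive more
// clicks than others appear taller over time.
function recordInteraction(weights, id, amount = 1) {
  return { ...weights, [id]: (weights[id] ?? 0) + amount };
}
function heightFromWeight(weight, pxPerUnit = 2) {
  return weight * pxPerUnit;
}
```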
In some embodiments, a page profile, system profile, action profile, or any information repository can include information configured to identify interface elements of a certain type (e.g., telephone numbers, e-mail addresses), where based on the type of element, the 3D appearance of the respective element changes further (e.g., the element can appear closer to the surface of the device's display screen, or the element can be emphasized), and one or more actions can occur when the element is selected or otherwise interacted with. In accordance with various embodiments, such elements can include telephone numbers, addresses, URLs, images, contact information, dates, event information, and the like. Actions can include causing the device to open a phone application and dial a number, causing the device to open a navigation application and provide directions to an address, causing the device to open an address book to save an address, causing the device to open a web browser to navigate to the webpage indicated by a URL, causing the device to open a photo album to save an image, causing the device to open an address book to save contact information, and so on. In some situations, the user can "glance at" an identified element (e.g., a telephone number) and see additional information, such as an address associated with the telephone number, or a picture representing the owner of the telephone number displayed to the side of the 3D representation of the telephone number. Various other elements (and associated actions and related information) can be displayed, and it should be noted that the examples provided should not be considered limiting.
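The type-to-action mapping might look like the sketch below; the regular expressions and action names are illustrative assumptions, not the patent's actual rules.

```javascript
// Classify an element's text as a URL, a telephone number, or plain
// text, and map each recognized type to the action the device would
// take when the element is selected (open the browser, dial, etc.).
function classifyElement(text) {
  if (/^https?:\/\/\S+$/.test(text)) return "url";
  if (/^\+?\d[\d\s\-()]{6,}$/.test(text)) return "phone";
  return "text";
}
const actionFor = {
  url: "open-browser",     // navigate to the page indicated by the URL
  phone: "open-phone-app", // dial the recognized number
  text: null,              // no special action
};
```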
As described, the rendering of the interface elements can change as the orientation of the device changes. This can include tilting, rotating, or otherwise changing the position of the device. Fig. 3(b) illustrates a stacked arrangement of interface elements. In this example, the computing device 300 has been tilted or rotated, where such a change can cause the device to alter the appearance of the interface. In this example, upon detecting the change in orientation of the computing device 300, the interface 301 is rendered such that the interface elements are divided into stacks, where the interface elements appear stacked on, or otherwise interleaved with, one another. It should be noted that other inputs can be used to cause the device to alter the appearance of the interface elements. Such inputs can include voice, touch, and/or gesture input. For example, the user can make a motion or gesture within the field of view of the device, which can cause the device to alter the appearance of the displayed interface elements. Other information can be used as well, such as detecting that the user is holding, contacting, or squeezing the device. For example, the article text element 306 can be stacked on a link element for a business section 330, a link element for a sports section 332, and a link element for a calendar section 334. In various embodiments, the appearance (e.g., position, arrangement, and/or depth) of an interface element can be based at least in part on the importance or relevance of that interface element as determined for the user, as set by the author/developer of the page, and/or a combination of the two. In various embodiments, the device can adjust the appearance of shadows associated with the relevant interface elements to cause those elements to appear taller in the interface and to provide a 3D appearance, since each shadow can appear to move relative to its associated element as the point of view changes, such as to provide the impression of real shadows cast by the associated elements. Further, the interface can render sidewalls or other elements that, from the user's point of view, appear to provide depth to the interface elements, and the extent and shape of these elements can adjust as the point of view and the orientation of the device change. Various other behaviors can be used to simulate 3D behavior and stacked interface elements as well. In this example, an advertisement can be rendered on a lower level without shading, and positioned below the links 330, 332, and 334, to provide the advertisement with the appearance of being positioned at a lower z-depth. In at least some interfaces there can be more than three levels, and the amount of shading, color adjustment, and other such aspects can depend at least in part on the level associated with an element.
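The shadow behavior can be approximated with simple trigonometry; this sketch assumes viewing angles supplied by head tracking and is not taken from the patent.

```javascript
// A cast-shadow sketch: the shadow's offset from its element grows with
// the element's height above the page and shifts opposite the viewing
// direction, so shadows appear to move as the point of view changes,
// mimicking a real shadow cast by a raised element.
function shadowOffset(elementHeight, viewAngleX, viewAngleY) {
  return {
    dx: -Math.tan(viewAngleX) * elementHeight,
    dy: -Math.tan(viewAngleY) * elementHeight,
  };
}
```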
As described, the type of content displayed and/or the layout of the content can be determined based at least in part on a user profile for the user. As described, the user profile can include information such as the user's browsing history or other preference information for the user. Thus, when the device is operating in the 3D view mode, content determined to be relevant to a particular user can be highlighted or otherwise displayed more prominently than other content. For example, based on information in the user's profile, the links to the business, sports, and calendar sections can be displayed prominently, because it has been determined that the user frequently accesses those sections.
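A minimal sketch of the profile-based prioritization just described, assuming the browsing history is available as a simple list of visited section names (a simplification for illustration):

```python
from collections import Counter

def rank_sections(browsing_history, sections):
    """Order the page's sections by how often the user has visited
    them, so the most relevant links can be rendered more prominently
    in the 3D view mode. Ties keep the page's original order."""
    visits = Counter(browsing_history)
    return sorted(sections, key=lambda s: visits[s], reverse=True)
```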
In various situations it can be useful to change the manner in which content is displayed, such as to improve readability, appearance, and/or access to the content. For example, in some cases it can be useful to display content on a 3D shape (such as a cube), where the content is displayed on the walls of the 3D shape and the user can navigate between the content displayed on the different walls. For example, as shown in FIG. 4(a), upon detecting activation of the 3D view mode, the interface 401 on the display screen 402 of the computing device 400 can be rendered such that the displayed page content areas are split onto the different walls of a 3D shape. A page content area can be any one of the walls (or a portion of a wall) of the 3D shape. The user can move the content to view the content of any one of the walls in the 3D space, where the content displayed on a respective wall can include one or more interface elements. As described, the interface elements can include a page title 404, article text 406, at least one image 408, at least one link 410, an advertisement 414, and various other links 416 and other elements. It should be noted that although a cube is illustrated, content can be rendered on the sides or walls of any 3D shape.

As the user of the computing device tilts, rotates, translates, flicks, or otherwise changes the relative orientation of the device, the display of the content can be adjusted to provide views of different ones of the walls. For example, when the user rotates the device counterclockwise 432 about the axis 430 of the device, the rotation of the device can cause the displayed content to transition correspondingly (e.g., rotate counterclockwise). In this way, the user can focus on one wall or surface at a time, and can transition the content into focus through movement of the device. It should be noted that other inputs can be used to cause the device to alter the appearance of the interface elements. For example, the user can make a counterclockwise motion or gesture within the field of view of the device, which can cause the displayed content to transition correspondingly (e.g., rotate counterclockwise). For example, as shown in FIG. 4(b), rotating the device counterclockwise (or, e.g., making a counterclockwise gesture) can cause a corresponding rotation of the interface elements, where the advertisement 414, article text 406, and link 410 rotate about the interface in a counterclockwise direction. In accordance with various embodiments, rotation of the device can cause any number of changes to the orientation and/or appearance of the interface elements. For example, rotation of the device, or another change in orientation, can cause the device to enter a reading mode or other similar mode. In such a mode, the device can cause the interface to focus on the most consumed portions of the page (e.g., images, text, etc.) by placing such content in the center of the page, while other content can be moved toward the edges of the display screen.
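One way to sketch the wall-selection logic just described, assuming the cumulative rotation about the axis is available in degrees and the walls of the 3D shape are equally spaced (both assumptions made for illustration):

```python
def wall_for_rotation(total_rotation_deg, num_walls=4):
    """Select which wall of the 3D shape faces the user, given the
    cumulative rotation of the device about its axis. Counterclockwise
    rotation (positive degrees) advances through the walls, snapping
    to the nearest wall so one surface is in focus at a time."""
    per_wall = 360.0 / num_walls
    return int(round(total_rotation_deg / per_wall)) % num_walls
```

A gesture input (such as the counterclockwise motion mentioned above) could feed the same function by converting the gesture into an equivalent rotation delta.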
In various embodiments, the device can use information such as the field of view of a camera and the position of the user's head or eyes to determine the user's current point of view, and that point of view can be used to render the interface on a display screen or other such element of the computing device. The rendering can be updated as the determined point of view changes as a result of movement of the user and/or the computing device. The rendering can utilize 3D mapping information, such as a set of layer depths or z-levels, to determine how the various interface elements relate to one another.
For example, FIG. 5(a) illustrates an example interface displayed on the display screen 502 of a computing device 500. In this example, a user 540 is viewing a conventional 2D representation of a webpage 501. As with many conventional webpages, the area of the webpage can be divided into sections or regions. In this example, the webpage includes a title 504 and at least one link 510. The device can also include at least one camera 550 configured to capture one or more images, such as images of the user, within the field of view 542 of the camera. The images of the user can be processed using one or more facial and/or gaze tracking algorithms in order to determine the viewing or gaze direction of the user relative to the device. The gaze direction can be used to determine the region of the display screen of the computing device, the interface object, or another portion at which the user is likely looking. Accordingly, the rendering of the interface elements can change as the user's relative gaze direction changes with respect to the device. For example, as shown in FIG. 5(b), the user's gaze has changed from being generally directed at the title 504 to being generally directed at the link interface icon 510. In accordance with various embodiments, the change in viewing direction can cause the device to modify the appearance of the interface elements. For example, a change in the user's gaze direction can be an indication of important content, or of a region of the display screen that the user may want to view. Accordingly, approaches in accordance with various embodiments can use the change in the user's gaze direction to activate the 3D view mode. Thus, when the user "glances at" or otherwise looks around the device, the interface elements, content, or other regions determined to be in the user's gaze direction can be modified, such as by rendering a 3D representation of an interface element in order to display additional information on the edges and/or sides of the 3D representation of that interface element and/or content. In this way, when the user glances at the display screen to view certain content, that content can be highlighted, emphasized, or otherwise modified to improve the readability of the content or to provide additional information related to the content.
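Once the gaze-tracking algorithms yield an estimated gaze point on the screen, determining which interface element the user is likely looking at can be a simple bounding-box hit test, sketched below. The `(id, x, y, width, height)` element layout is an assumption made for illustration.

```python
def element_at_gaze(gaze_xy, elements):
    """Return the id of the interface element whose bounding box
    contains the estimated gaze point, or None if the gaze falls
    between elements. Each element is (element_id, x, y, w, h) in
    screen coordinates."""
    gx, gy = gaze_xy
    for element_id, x, y, w, h in elements:
        if x <= gx < x + w and y <= gy < y + h:
            return element_id
    return None
```

The returned element would then be the one whose appearance is emphasized or rendered in 3D, per the description above.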
For example, FIGS. 6(a) and 6(b) illustrate a situation in which a 2D interface element, or other 2D content displayed on an interface of a computing device, is transformed into a 3D interface element capable of displaying additional content and/or information. As described, the device can alter the content in response to the device entering the 3D view mode. The 3D view mode can be activated in response to a number of different 3D view mode activation events. For example, the 3D view mode can be activated in response to detecting a change in orientation of the device (such as a tilt, rotation, or shake). In some situations, the orientation change utilized to activate the 3D view mode can be predetermined, and in various embodiments the user can configure the device, or otherwise program the device, to recognize a personalized orientation change as the action for activating the 3D view mode. Various other approaches can also be used to activate the 3D view mode. For example, the 3D view mode can be activated by a touch input, where the user can select, touch, or otherwise interact with a physical and/or user-selectable interface element to activate the 3D view mode. In various other embodiments, the 3D view mode can be activated by voice and/or gesture input. In some embodiments, the 3D view mode can be activated when the device detects one or more external sources (e.g., a 3D view mode activation signal, a wireless signal, etc.). For example, when the device is in proximity to one of the external sources, the device can cause the 3D view mode to be activated. Example external sources can include wireless networks, peer-to-peer networks, telephone networks, RFID signals, mobile devices or other computing devices, and the like.
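The activation events enumerated above might be funneled through dispatch logic along these lines. The event dictionary shape and the 20-degree tilt threshold are illustrative assumptions, not values taken from the embodiments.

```python
def should_activate_3d(event):
    """Decide whether an input event activates the 3D view mode.

    Covers the event families described above: orientation changes
    past a (configurable) threshold, touch/voice/gesture input, and
    detection of an external source such as an RFID or wireless
    activation signal."""
    if event["type"] == "orientation" and abs(event.get("tilt_deg", 0)) >= 20:
        return True
    if event["type"] in ("touch", "voice", "gesture"):
        return True
    if event["type"] == "external_signal":  # e.g., wireless beacon, RFID
        return True
    return False
```

A personalized activation gesture, as mentioned above, would amount to replacing the fixed threshold with a user-configured orientation pattern.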
The appearance can be altered in response to a change in orientation of the device, or a change in the user's gaze direction relative to the device. FIG. 6(a) illustrates a 2D interface showing a profile view 620 of a person. The profile view is displayed on the display screen 602 of a computing device 600. The interface 601, webpage, or other content page can display at least an image 624 of the person's profile and information 622 about the person. In this example, the 3D view mode can be activated due to a change in orientation of the device. For example, the device can be tilted away from the user (i.e., the top of the device moves away from the user and the bottom of the device moves toward the user). As mentioned, when the device is operating in the 3D view mode, the device alters the rendering of the interface, such as to show a 3D representation of the interface. For example, as shown in FIG. 6(b), upon detecting the change in orientation of the device, a 3D representation of the image 624 of the person can be rendered. In various embodiments, in addition to the change in appearance and/or arrangement, additional content that is related to the displayed content, and/or is otherwise determined to be of interest or useful, can be provided. The additional content can be displayed along the 3D interface element, such as on a side or edge of the element, where the additional content can include text, images, links, or other information.

For example, rendered on at least one side of the 3D representation can be information 626 related to the person. The related information can include links to other pages, information related to the profile, images related to the profile, or any other type of information. If the user wants to view content on a different side of the interface element, the user can tilt or rotate the device to cause the view of the interface element to change so as to display a different side of the element. For example, as shown in FIG. 6(c), tilting the device causes a different side of the interface element to be displayed. In this example, the exposed side of the interface element can include information such as a set of related images 628 associated with the profile image 624. Further rotating and/or tilting the device can cause other sides of the interface object to be displayed, where each side can include information and/or content. As shown in FIG. 6(d), the user has tilted the device toward the user in order to "glance into" the interface element. In such a situation, information 628 related to the person's profile can be rendered inside the 3D interface element. As described, the information can include information related to the profile and/or other such information. It should be noted that the interface object can include any number of sides, edges, or surfaces, and different surfaces can cause any type of information to be rendered on those surfaces.
As described, in various situations it can be useful to change the manner in which content is displayed, such as to improve readability, appearance, and/or access to the content. For example, in various embodiments, interface elements and/or other content can be rendered on individual pages, and the user can navigate from one page to another by selecting a link (e.g., an element, object, or number indicating a content page in a series of content pages), where the link causes the page associated with the link to be loaded. For example, as shown in FIG. 7(a), an image 708 of an elephant is displayed on an interface 701 (e.g., a webpage) rendered on the display screen 702 of a computing device 700. In this example, the user can navigate to different pages of the website by selecting a navigation link or element 750, 752. For example, the user can navigate to the second page of the website by selecting the navigation link 750. Alternatively, the user can navigate to the third page of the website by selecting the navigation link 752. In at least some embodiments, the user can activate the 3D view mode to alter the rendering of the image, such as to render a 3D representation of the image. The 3D view mode can also change the manner in which the user interacts with the content. As described, the 3D view mode can be activated by tilting the device (e.g., as shown in FIG. 7(b)). In this example, the related pages are displayed as a stacked arrangement of pages. This provides a visual representation of the linked pages. The user can also "glance at" the pages, as shown in FIG. 7(c), such as by tilting the device, which can cause the pages to be fanned out such that a partial view of at least some of the pages is rendered for the user. Other arrangements are also possible, such as arrangements in which the pages are stacked or laid out so that a partial view of at least the content located on those pages is available. The user can interact with the pages, such as through touch or gesture input, in order to cause the device to navigate to a selected page. For example, the user can "tap" or otherwise select a desired page, and upon selection of the page, the device can cause the corresponding page to be loaded.
In various embodiments, an interface element can display additional content or information on its edges, sides, or other faces. For example, as shown in FIG. 8(a), at least one interface object 808 can be displayed on an interface 801 on the display screen 802 of a computing device 800. In this example, the interface object is an image of an elephant. The rendered image can be the result of an image search for elephants, or an image from an article or other page. In this example, the position of the 3D representation of the elephant can be adjusted to visually indicate the importance or relevance of the displayed image. For example, the 3D representation can appear to "float" above, or otherwise be positioned on top of, the other interface objects, indicating that the image is likely more important or relevant than the other images. In this example, the other interface elements 860, 862 can be related content and/or related images from the image search query, or otherwise associated with the interface object. In various embodiments, the appearance can be altered such that additional information about the elephant is displayed on a side or edge of the content. For example, in response to the user selecting the elephant, the shape of a side and/or edge can change to accommodate the display of additional information. For example, as shown in FIG. 8(b), the side 812 appears to slope downward and extend over the interface elements 860 and 862 in order to accommodate additional content. Additional information can also be displayed by altering the sides and/or edges in other ways, such as by adjusting the height, shape, or color of the edges and/or sides of the 3D representation of the elephant.
As described, the direction of the user's gaze can be used to determine the region of the interface on the display screen of the computing device, the interface element, or another portion at which the user is likely looking, and when the 3D view mode is activated, the appearance of the region of the webpage or application that the user is viewing can be changed. For example, FIG. 9(a) illustrates an example interface display on the display screen 902 of a computing device 900. In this example, the device can include at least one camera 950 configured to capture one or more images, such as images of the user, within the field of view 942 of the camera. The images of the user can be processed using one or more facial and/or gaze tracking algorithms in order to determine the viewing or gaze direction of the user relative to the device. The gaze direction can be used to determine the region of the display screen of the computing device, the interface object, or another portion at which the user is likely looking. In accordance with various embodiments, the rendering of the interface elements can change as the user's relative gaze direction changes with respect to the device. For example, when viewing a conventional webpage, the content the user is viewing and the other content of the webpage (such as likely less important content on the periphery) have the same amount of zoom. In accordance with various embodiments, the content the user is viewing can be modified to appear larger than the surrounding content, as if the content were being viewed through a magnifying glass. In this way, as the user views content, that content can be magnified. Accordingly, some regions of the webpage can appear to zoom in while other regions of the content appear to zoom out. The content can be zoomed in and out at different and independent rates. This advantageously allows for simultaneous global and local views of the content.

For example, as shown in FIG. 9(a), the user focuses their gaze on a region 916 of the webpage, and the focused region appears magnified relative to other regions of the webpage (such as the edges 912 and 914 of the webpage). The edges of the webpage can include other content on the webpage, content related to the focused region, and/or any other content. The user can navigate the website by changing the focus of their gaze, and as the user shifts their gaze, the content the user focuses on can be magnified or otherwise altered in appearance to reflect that it is in focus, while other content can be reduced. For example, as shown in FIG. 9(b), as the user's gaze shifts to the top of the page, the content located at the top of the page can be magnified or otherwise emphasized, such as by being shaded, highlighted, animated, or changed in another noticeable way. Thus, in accordance with various embodiments, as the user "moves" between regions of interest (e.g., gazes around the page), the portion in focus can have its appearance adjusted, such as by magnification, to maximize the content of the focused region. In this way, the user can navigate the page simply by changing the focus of their gaze, as shown in FIG. 9(c). In this example, as the user's gaze shifts to the top left of the page, the content at that location is magnified or otherwise emphasized. It should be noted that content can be emphasized in other ways, and magnification is used here to illustrate the general idea of altering the appearance of content. For example, when some portions of the screen are magnified and others are not, the emphasized content may stretch, tilt, or otherwise warp around the displayed screen. Other effects are also possible, such as changing the screen resolution of the focused region, which can adjust the sharpness, size, and amount of things that can fit on the display screen.
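The magnifying-glass effect, in which focused content zooms in while peripheral content zooms out at an independent rate, could be modeled as a per-point zoom factor like the sketch below. The linear falloff, the 150-pixel radius, and the peripheral factor of 0.9 are all assumptions chosen for illustration.

```python
import math

def zoom_factor(point_xy, gaze_xy, max_zoom=2.0, radius=150.0):
    """Magnification applied to content at point_xy, falling off
    linearly with distance from the estimated gaze point, so the
    focused region zooms in while peripheral content shrinks
    slightly -- giving simultaneous local and global views."""
    dx = point_xy[0] - gaze_xy[0]
    dy = point_xy[1] - gaze_xy[1]
    d = math.hypot(dx, dy)
    if d >= radius:
        return 0.9  # peripheral content zooms out a little
    return max_zoom - (max_zoom - 1.0) * (d / radius)
```

As the gaze point moves (FIGS. 9(b)-9(c)), recomputing this factor per region would shift the magnified area with the user's gaze.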
In various embodiments, the device can display metadata associated with the various objects on a page, regardless of whether the device is tilted or rotated, when the device determines that the user is gazing in the direction of the metadata. In this situation, the appearance of the interface elements can remain unchanged; instead, the information associated with an element and/or the page can be emphasized, highlighted, or otherwise modified, such as by increasing the 3D depth and/or prominence of that information. For example, information that would normally be displayed in a "hover" mode (e.g., when a cursor is over an object) can be a candidate for information that can be displayed when the device is operating in the 3D view mode. Other examples of metadata can include the time the page was last updated, links to already-viewed articles, images from an article, related data, metadata about the context in which the viewed page is contained, and the like.
In at least some embodiments, the 3D view mode can be utilized to activate a 3D representation of a sitemap. For example, FIG. 10 illustrates an example interface 1002 on the display screen 1001 of a computing device 1000. In this example, in response to a particular movement of the device, the current webpage (e.g., a homepage) is reduced to a thumbnail-sized interface element. For example, moving the device away from the user can cause the representation of the sitemap to be rendered. In this example, connected around the current page (e.g., the homepage) are other interface elements representing related pages, traffic flow, and/or links between the homepage and its sibling, parent, or child pages. For example, as shown in FIG. 10, the homepage 1060 is connected to a sports page 1010, a finance page 1020, a business page 1030, a calendar page 1040, and a related webpage 1050. The connections between webpages can be represented by lines of various thicknesses (or weights), colors, and the like. In accordance with various embodiments, the height of a thumbnail can be based on the probability that the user will navigate to that destination, where pages with content the user is most interested in can appear taller than pages the user is less interested in. The user's interest in any of the pages can be determined based at least in part on the user's browsing history or other profile data. The thumbnails can be arranged in relative 3D space, where thumbnails determined to be more relevant or interesting are positioned closer to the user, and thumbnails the user is likely less interested in are positioned farther from the user. Hue, shading, and various other visual characteristics can be used to convey the appearance of depth, distance, and orientation. The amount of traffic to a given thumbnail can be illustrated by adjusting the weight of the links to that thumbnail. For example, a thumbnail (i.e., a page) that receives a large amount of traffic can be given a heavier weight than a page that receives less traffic. The number of connections, thumbnails, related pages, and other information displayed can be based at least in part on the degree to which the user has zoomed the page. For example, in a zoomed-out state, the related or primary thumbnails can be displayed together with the connections between the thumbnails. As the page is zoomed in, the pages related to a thumbnail can be displayed together with information associated with those pages and thumbnails (such as a snapshot of the content on those pages, e.g., a live view).
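A sketch of how thumbnail height and link weight might be derived from profile data and traffic, per the description above. The input layout (page name mapped to a visit probability and a traffic count) and the scaling constants are assumptions made for illustration.

```python
def sitemap_layout(pages, max_height=120, max_weight=8):
    """Compute sitemap visuals: thumbnail height grows with the
    probability the user will navigate to that page, and link weight
    (line thickness) grows with the page's share of traffic.

    `pages` maps page name -> (visit_probability, traffic_count).
    """
    top_traffic = max(t for _, t in pages.values()) or 1
    layout = {}
    for name, (prob, traffic) in pages.items():
        layout[name] = {
            "height": round(max_height * prob),  # taller = likelier destination
            "link_weight": max(1, round(max_weight * traffic / top_traffic)),
        }
    return layout
```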
FIG. 11 illustrates an example process for rendering various interfaces in accordance with various embodiments. It should be understood that, for any process discussed herein, there can be additional, fewer, or alternative steps performed in similar or alternative orders, or in parallel, within the scope of the various embodiments unless otherwise stated. In this example, a 3D view mode is activated on the computing device 1102, which in this example includes enabling 3D display of, and interaction with, an interface (such as a webpage, content page, application, etc.). For example, various embodiments enable interface elements to be highlighted, emphasized, animated, or otherwise changed in appearance and/or arrangement based at least in part on the orientation of the device and/or the user's viewing angle relative to the device when rendering those interfaces. In some embodiments, the 3D view mode can be automatically turned on whenever the computing device is active, even in a sleep mode or other such low-power state. In other embodiments, the 3D view mode is activated automatically upon running an application or accessing an interface, or manually upon user selection. Various other activation events can be utilized as well.
The captured image information can be analyzed as discussed elsewhere herein to determine the relative position of the user's head (or another such feature) with respect to the computing device 1104. As discussed, the image information can be used to determine an initial relative position, and a combination of image information and motion sensor information can be used to determine changes in that relative position. For example, motion sensor data can be used to update the point of view until additional position determination information can be obtained from the image data, whereupon any necessary corrections or adjustments to the determination can be made. For the current relative user point of view, the interface can be rendered such that a 2D representation of a webpage is displayed on the display screen of the device 1106. As described, the webpage can include one or more interface elements, such as a title, article text, at least one image, at least one link, an advertisement, and various other links. When a change in orientation of the device is detected 1108, a three-dimensional or multi-layer appearance, or other such aspect, of at least a subset of the one or more interface elements is rendered 1110. This can include bringing an element "forward" in the rendering such that the element appears to be closer to the surface of the display screen on a 3D display, or in a quasi-three-dimensional rendering on a two-dimensional (2D) display screen. In accordance with various embodiments, bringing an element forward can include, for example, adjusting the size, shape, shading, focus/blur, and/or coloring of the element. For example, the interface elements can appear to be positioned and/or displayed in 3D space, such that some interface elements (e.g., text, images, etc.) become larger in 3D depth and/or appear closer to the surface of the display screen of the computing device, while other interface elements (e.g., advertisements) "fall back" or appear smaller in 3D depth. As the user tilts, rotates, or otherwise changes the orientation of the device 1112, the interface elements can move back and forth or otherwise change shape or appearance 1114. When the input is completed, or the user otherwise finishes with the 3D view mode, the elements can be moved "backward" or otherwise rendered in 2D, such as by performing adjustments opposite to, or in place of, those performed when the elements were brought into the active rendering, and the rendering for that interface can end.
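The image/motion-sensor fusion described above — dead-reckoning with motion sensor deltas between camera fixes, then correcting toward a fresh camera-based position when one becomes available — might be sketched as follows. The 2D head coordinates and the fixed blend gain (`trust`) are simplifying assumptions.

```python
def update_head_position(prev_xy, motion_delta_xy, camera_fix_xy=None, trust=0.8):
    """Track the user's head position relative to the device.

    Between camera frames, predict from motion-sensor deltas (the
    device moved, so the head moves the opposite way relative to the
    screen). When a camera fix arrives, pull the drifted estimate
    toward it, applying the correction the text describes."""
    x = prev_xy[0] - motion_delta_xy[0]
    y = prev_xy[1] - motion_delta_xy[1]
    if camera_fix_xy is not None:
        x = (1 - trust) * x + trust * camera_fix_xy[0]
        y = (1 - trust) * y + trust * camera_fix_xy[1]
    return (x, y)
```

A production tracker would use a proper filter (e.g., Kalman) in 3D; this linear blend only illustrates the predict-then-correct structure.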
Further, in some embodiments, the ability to make certain elements appear closer to the screen can be used for other purposes as well. For example, higher-priority items (e.g., upcoming appointments or new messages) can be rendered at a higher level in the interface. Other approaches can be used as well, such as positioning unread messages at a level higher than read messages, making newer social network comments appear closer to the screen than older comments, and the like. Various interface approaches can utilize aspects discussed and suggested herein, as will be apparent to one of ordinary skill in the art in light of the teachings and suggestions contained herein.
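A toy scoring of the priority-to-depth mapping just described; the cue names and point values are invented for illustration only.

```python
def z_level(item):
    """Assign an interface z-level (higher = closer to the screen)
    from simple priority cues: unread messages sit above read ones,
    imminent items get the biggest boost, and newer items sit closer
    than older ones."""
    level = 0
    if item.get("unread"):
        level += 2
    if item.get("upcoming_appointment"):
        level += 3
    level += max(0, 2 - item.get("age_days", 0))  # recency bonus
    return level
```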
In accordance with various embodiments, when a user selects, clicks on, or otherwise interacts with a link in order to navigate to another page, the device can be caused to animate at least a subset of the interface elements on the display screen out of the 3D view of the first content page. For example, FIG. 12(a) illustrates an example interface display 1201 on the display screen 1202 of a computing device 1200. Displayed on the screen is a representation of the current page. In this example, an image of an elephant is displayed. The animation of navigating from the current page to another page can include "dropping" 1220 the page in 3D space from a first height 1222 to a second height 1224. The page can be separated into one or more portions, regions, segments, or interface elements (1230, 1232, 1234). Thereafter, at least a subset of the interface elements can be displayed traveling at different rates, as shown in FIG. 12(c). Further, the interface elements can be displayed rotating, tilting, shaking, or otherwise animating in a number of different ways as the interface elements exit the display screen of the computing device. The animation can also show one or more interface elements (1260, 1262, 1264, 1266, 1268) of the second page appearing in 3D on the display screen. The interface elements of the second page can replace the interface elements of the first page until the rendering of the second page is complete, such as the rendering of the second page 1280 shown in FIG. 12(d). In some situations, the animation of the formation of the second page can include stitching, merging, or otherwise combining the interface elements of the second page. Other animations are also possible. In some embodiments, the rate of appearance of the second page or second set of content, or the duration spent loading and rendering it, can be based on network speed (e.g., data connection speed). For example, in the situation where the device is connected to a fast network (e.g., a fast mobile or other network data connection), the animation of the appearance can occur quickly, since the amount of latency in navigating from the first page to the second page is likely to be relatively short. In the situation where the device is connected to a slow network connection, the animation of the appearance of the second page can occur over a longer period of time, since the amount of latency is likely to be relatively long. In either situation, such animations can advantageously be used to mask latency, and can be used to modify or otherwise enhance the appearance and/or departure of a content page as it loads.
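The network-speed-dependent pacing described above could be as simple as stretching the transition animation to cover the expected load latency, clamped to sensible bounds; the bound values below are assumptions.

```python
def transition_duration_ms(estimated_latency_ms, floor_ms=200, ceiling_ms=2000):
    """Duration of the page-transition animation: long enough to mask
    the estimated page-load latency (slow connections get a longer
    animation), but never jarringly short or tediously long."""
    return min(ceiling_ms, max(floor_ms, estimated_latency_ms))
```

On a fast connection the animation runs at the floor duration; on a slow one it expands toward the ceiling, hiding the wait.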
FIG. 13 illustrates an example process for rendering various interfaces in accordance with various embodiments. In this example, a 3D view mode is activated on the computing device 1302, which in this example enables 3D display of, and interaction with, an interface (such as a webpage, content page, application, etc.). As described, the 3D view mode can be automatically turned on whenever the computing device is active, even in a sleep mode or other such low-power state. In other embodiments, the 3D view mode can be activated automatically upon running an application or accessing an interface, or manually upon user selection. Various other activation events can be utilized as well. The 3D view mode can cause the device to display one or more interface elements in 3D space. For example, the interface elements can appear to be positioned and/or displayed in 3D space, such that some interface elements (e.g., text, images, etc.) become larger in 3D depth and/or appear closer to the surface of the display screen of the computing device, while other interface elements (e.g., advertisements) "fall back" or appear smaller in 3D depth. As the user tilts, rotates, or otherwise changes the orientation of the device, the interface elements can move back and forth or otherwise change shape or appearance. In this example, a 3D view of a first webpage is displayed on the display screen of the computing device 1304. The first page can include a number of interface elements, such as images, text, advertisements, and the like.
When a request to navigate from the first page to a second page (e.g., a webpage) is received 1306, the device can cause at least a subset of the interface elements to be animated on the display screen out of the 3D view of the first content page 1308. For example, the animation can include "dropping" the first page in 3D space, where at least a subset of the interface elements can be displayed traveling away from the user at different rates as the interface elements fall. Accordingly, the interface elements can be displayed rotating at different rates, spinning, or otherwise animating in a number of different ways as the interface elements exit the display screen. The animation can also show one or more interface elements of a second content page appearing in 3D on the display screen, replacing the interface elements of the first page 1310. Thereafter, a representation 1312 of the second webpage can be displayed. In various embodiments, the representation of the webpage can be a 2D representation, while in other embodiments the representation can be a 3D representation. In accordance with various embodiments, such animations can advantageously be used to mask latency, and can be used to modify or otherwise enhance the appearance of a page as it loads.
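The "elements traveling at different rates" behavior of the exit animation might be sketched as a staggered per-frame offset, with each successive element falling slightly faster than the last; the speed constants are illustrative assumptions.

```python
def fall_offsets(num_elements, frame, base_speed=4.0, stagger=1.5):
    """Vertical offsets (px) for page elements 'dropping' out of view.
    Each element falls at a slightly different rate, so the page
    visibly separates into its segments as it exits the screen."""
    return [round((base_speed + i * stagger) * frame, 1)
            for i in range(num_elements)]
```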
Figure 14 illustrates an example process for presenting various interfaces in accordance with various embodiments. As described, various embodiments provide the ability to specify 3D properties (e.g., appearance, behavior, etc.) of interface elements displayed on an interface (e.g., a webpage, application, etc.) or other such page. In this example, a request to display a webpage on the display screen of a computing device is received 1402 from a user. The webpage can include one or more interface elements, such as images, text, links, and other such elements. A document object model (DOM) can be generated 1404. The document object model can contain information used to display at least one interface element of the webpage. The document model can be organized as a tree structure that includes one or more nodes, at least a subset of which can be associated with at least one tag.
At least a subset of the interface elements can likewise be associated with tags, attributes, or other elements that specify a 3D appearance change to be applied to the respective interface element when the 3D view mode is activated. The 3D appearance change specified by a tag can include a change in at least one of the size, shape, shadowing, focus, coloring, position, and/or animation of at least one element. In some cases, the type of appearance change specified by a tag can be based at least in part upon a user profile or a page profile, where the user profile includes information indicating at least one of a browsing history or a preference of the user, and the page profile includes information indicating a default three-dimensional representation. In some cases, a tag can be configured to specify a plurality of three-dimensional appearance changes, where the change caused by the respective tag is based at least in part upon the orientation of the computing device. For example, a tag may specify a first three-dimensional appearance change when the device is in a first orientation and a second three-dimensional change when the device is in a second orientation.
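An orientation-dependent tag of the kind just described could be resolved as sketched below. The tag shape (a per-orientation map with a fallback) and the specific depth/scale values are assumptions for illustration; the specification does not define a concrete tag format:

```javascript
// Illustrative resolution of an orientation-dependent 3D appearance tag.
// The tag format (a per-orientation map with a default) is an assumption.
function resolveAppearance(tag, orientation) {
  return tag[orientation] ?? tag.default ?? null;
}

// Example tag: an advertisement recedes further when the device is upright.
const adTag = {
  portrait: { depth: -40, scale: 0.9 },
  landscape: { depth: -10, scale: 1.0 },
  default: { depth: 0, scale: 1.0 }
};
```

On each detected orientation change, the resolved entry would be translated into the corresponding 3D transform for the element.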
In some embodiments, a developer of such a page can use information (e.g., tags, CSS, JavaScript, etc.) to specify the 3D appearance changes to be applied, while in other embodiments the interface can determine the 3D appearance change to apply to at least one element of the interface without the developer specifying one. For example, the interface can utilize a DOM hierarchy, object model, or other such model hierarchy that includes such information to determine how to display, or otherwise alter the appearance of, an interface element in 3D space (e.g., by bringing the element "forward" toward the front of the display screen or moving the element "back" from the front of the display screen).
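One simple fallback for untagged elements, consistent with using the model hierarchy itself, would be to derive a depth from DOM nesting level. The heuristic (deeper-nested content comes forward, a fixed number of pixels per level) is an assumption for the sketch, not the patent's algorithm:

```javascript
// Illustrative fallback: infer a depth from DOM nesting when no tag is
// present. Deeper-nested (more specific) content is brought forward.
// The per-level step is an assumed value.
function inferDepth(node, perLevelPx = 8) {
  let level = 0;
  for (let n = node; n.parent; n = n.parent) level++;
  return level * perLevelPx; // z-offset in pixels toward the viewer
}
```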
A 2D representation of the webpage is displayed 1406. When a change in the orientation of the device is detected 1408, the 3D view mode is activated 1410, which can cause the device to render a 3D or multi-layer appearance, or other such aspect, of at least a subset of the one or more interface elements. As described, various embodiments allow interface elements to be highlighted, emphasized, animated, or otherwise changed in appearance and/or arrangement based at least in part upon the orientation of the device and/or the viewing angle of the user with respect to the device when those interfaces are rendered. When a subsequent change in the orientation of the device is detected, the 3D appearance change specified by the tag can be applied 1412 to the at least one interface element. Thereafter, a 3D representation of at least a portion of the webpage is displayed 1414. In various embodiments, actions can be associated with elements that have not been tagged. For example, based on the type of an element, the appearance of the element can be changed to appear closer to the surface of the display screen of the device, and one or more actions can occur when the element is selected or otherwise interacted with. In accordance with various embodiments, such elements can include telephone numbers, addresses, URLs, images, contact information, dates, event information, and the like. The actions can include causing the device to open a phone application and dial the number, open a navigation application and provide directions to the address, open an address book to save the address or contact information, open a web browser to navigate to the webpage indicated by the URL, open a photo album to save the image, and the like. For example, in the situation where the interface element is a telephone number, selecting the telephone number can cause a telephony application of the device to dial the number. Another example can include presenting a 3D view of an image along with a prompt or other interface element configured to save or otherwise store the image when selected. Yet another example can include recognizing a product on the page, presenting a 3D representation of the product, and providing, along with the product, a button or other interface element configured to save the product to a wish list or shopping cart of an electronic marketplace, among other such elements.
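The element-type-to-action association could be sketched as below. The `tel:` and `geo:` URI schemes are standard ways to hand off to a dialer or navigation application; the detection regular expressions are simplified assumptions and a real implementation would use more robust recognizers:

```javascript
// Illustrative mapping from a recognized (untagged) element type to the
// action URI invoked on selection. Detection regexes are simplified.
const ACTIONS = {
  phoneNumber: v => `tel:${v.replace(/[^+\d]/g, '')}`, // open the dialer
  url: v => v,                                          // open the browser
  address: v => `geo:0,0?q=${encodeURIComponent(v)}`    // open navigation
};

function detectType(text) {
  if (/^https?:\/\//.test(text)) return 'url';
  if (/^\+?[\d\s().-]{7,}$/.test(text)) return 'phoneNumber';
  return 'address';
}

function actionFor(text) {
  return ACTIONS[detectType(text)](text);
}
```

Selecting the element would then dispatch the returned URI to the platform, e.g. dialing the number for a telephone-number element.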
In at least some embodiments, a computing device can utilize one or more cameras or other such sensors to determine the relative direction of the user. For example, Figure 15(a) illustrates an example situation 1500 wherein a computing device 1502 is configured to utilize at least one camera element 1506 to attempt to locate a feature of a user, such as the user's head or eyes, for purposes of viewing-angle determination. In this example, the user's eyes 1504 are located within the field of view 1508 of a camera of the computing device 1502. As discussed elsewhere herein, however, the viewing angle of the user can be determined using the position of the user's eyes, pupils, head, or at least one other such feature that can be indicative of the viewing angle. In some embodiments, the device might look for an object held by, or otherwise associated with, the user to determine a general viewing angle for rendering. Further, in some embodiments a device can utilize at least two different cameras positioned on the device with a sufficient separation such that the device can utilize stereoscopic imaging (or another such approach) to determine a relative position of one or more features in three dimensions with respect to the device. It should be understood that there can be additional imaging elements of the same or a different type at various other locations on the device as well, within the scope of the various embodiments.
Software executing on the computing device (or otherwise in communication with the computing device) can obtain information such as the angular field of view of the camera, the zoom level at which the information is currently being captured, and any other such relevant information, which can enable the software to determine an approximate direction 1510 of at least one of the user's eyes with respect to the camera. In many embodiments, direction information will be sufficient to provide adequate viewing-angle-dependent rendering. In at least some embodiments, however, it can also be desirable to determine the distance to the user in order to provide a more consistent and accurate rendering. In some embodiments, approaches such as ultrasonic detection, feature size analysis, luminance analysis through active illumination, or other such distance-measurement approaches can be used to assist with position determination. In other embodiments, a second camera can be used to enable distance determination through stereoscopic imaging. Once the direction vectors from at least two image capture elements are determined for a given feature, the intersection point of those vectors can be determined, which corresponds to the approximate relative position of the respective feature in three dimensions, as known for disparity mapping and other such processes.
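The pixel-to-direction step described above can be sketched with a small-angle linear model: the feature's horizontal offset from the image center, scaled by the camera's angular field of view, gives its approximate bearing from the optical axis. The linear model is a simplifying assumption (a real lens model would use the tangent of the angle):

```javascript
// Illustrative pixel-to-direction conversion: given the horizontal pixel
// position of a detected feature, the image width, and the camera's
// angular field of view, estimate the feature's bearing (in degrees)
// from the optical axis. A small-angle linear model is assumed.
function bearingDegrees(featureX, imageWidth, fovDegrees) {
  const offset = featureX / imageWidth - 0.5; // -0.5 .. +0.5 across the frame
  return offset * fovDegrees;
}
```

Two such bearings from two separated cameras would yield the direction vectors whose intersection gives the feature's 3D position, as in the disparity mapping mentioned above.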
Further illustrating such an example approach, Figure 15(b) illustrates an example image 1520 of the head and eyes of a user, such as could be captured using the camera 1506 of Figure 15(a). One or more image analysis algorithms can be used to analyze the image to perform pattern recognition, shape recognition, or another such process to identify a feature of interest, such as the user's eyes. Approaches to identifying a feature in an image are well known in the art and will not be discussed in detail herein; such approaches can include feature detection, facial feature extraction, feature recognition, stereoscopic vision sensing, character recognition, attribute estimation, and radial basis function (RBF) analysis, among others. As illustrated in this example, both of the user's eyes could potentially be located in the captured image information. At least some algorithms can determine an approximate location or region 1522, 1524 for each eye, or at least an approximate location 1528 of the user's head, where at least one of those locations or regions can be used for the viewing-angle determination. Depending on factors such as the desired level of sensitivity and the distance between the user and the device, however, such information can affect the accuracy of the angle determination. Approaches in accordance with various embodiments can take advantage of the fact that the human brain combines and processes information from both eyes to provide a "single" viewing angle. Thus, the software can attempt to determine an intermediate point 1526 between the user's eyes to use as the user's viewing angle. Various other approaches, such as those discussed elsewhere herein, can be used as well. Once the relative location is determined in the image information, the device can use information such as the field of view of the camera, the position of the camera with respect to the device, the zoom level of the camera, and other such information in order to determine the relative direction of the user, with that relative direction being used for the viewing angle when rendering the interface.
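The "single viewing angle" point described here is simply the midpoint of the two detected eye regions, which can be computed as follows (the coordinate representation is an assumption for the sketch):

```javascript
// Illustrative "single viewing angle" point: the midpoint between the two
// detected eye locations, mirroring how the brain fuses binocular input.
function eyeMidpoint(leftEye, rightEye) {
  return {
    x: (leftEye.x + rightEye.x) / 2,
    y: (leftEye.y + rightEye.y) / 2
  };
}
```

The resulting image coordinate would then be converted to a relative direction using the camera's field of view and position, per the text above.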
When cameras are used to track location, however, the accuracy is limited at least in part by the frame rate of the camera. Further, images take some time to process, so there can be some lag in the determination. Because changes in the orientation of the device can occur relatively quickly, it can be desirable in at least some embodiments to enhance the accuracy of the viewing-angle determination. In some embodiments, a sensor or other such element of the computing device can be used to determine motion of the computing device, which can help refine the viewing-angle determination. The sensor can be any appropriate sensor capable of providing information about rotation and/or translation of the device, such as an accelerometer, inertial sensor, electronic gyroscope, electronic compass, and the like.
For example, Figure 16(a) illustrates a "top view" 1600 of a computing device 1602 operable to capture an image of an object 1604 (e.g., the user's head) within an angular view 1608 of a camera 1610 of the computing device. In this example, the computing device 1602 includes at least one orientation- or rotation-determining element, such as an electronic compass or electronic gyroscope, that is able to determine a first frame of reference 1606 in two or three dimensions with respect to a first orientation of the device. In at least some embodiments, an electronic compass might be used to determine an axis of the frame of reference 1606, as may correspond to a north direction, etc. In other embodiments, a component such as an electronic gyroscope might be calibrated periodically with a component such as a compass, but might instead determine changes in orientation about three axes of rotation over time. Various other approaches to determining changes in orientation about one, two, or three axes of rotation can be used as well within the scope of the various embodiments.
A first frame of reference 1606 or orientation can be determined at or near the time of capture of a first image by the camera 1610 of the computing device 1602. In some embodiments, the determination can be triggered by receiving input to capture an image or another such action, but in other embodiments the frame of reference and/or orientation information can be updated periodically, such as several times a second, based upon the type and/or configuration of the electronic gyroscope. The gyroscope can also be any appropriate electronic gyroscope component, such as a conventional MEMS gyroscope used in various consumer devices. Approaches to implementing such a gyroscope, and to obtaining changes in orientation therefrom, are well known in the art and will not be discussed in detail herein.
Figure 16(b) illustrates a second top view 1610 after a change in orientation of the computing device 1602. The electronic gyroscope (or other such component or embedded sensor) can detect the change in orientation, which in this example corresponds to a change by an angle 1612 with respect to the frame of reference in the plane of the figure. The gyroscope can present information about the change in orientation in any appropriate form, such as in degrees or radians of change for one, two, or three degrees of freedom (e.g., Δx, Δy, Δz), as a percentage change in pitch, roll, and yaw, etc. In this example, the change in orientation is determined to be a given angular amount of rotation 1612 about a single axis. As illustrated, this causes the object 1604 to move toward the right edge of the field of view 1608 of the camera 1610. In at least some embodiments, the gyroscope may not be accurate enough to provide the exact amount of rotation, but can provide an approximation or estimate of the amount of rotation that can be used to narrow the search space and facilitate locating the corresponding object in the image. Further, in at least some embodiments the information can provide a faster adjustment or prediction of relative position than is available from the camera alone. A similar approach can be used for translation, although the effect of translation on objects in captured images can be much less significant than that of angular changes, such that the image information may be sufficient to account for translation changes in at least some embodiments.
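The use of the gyroscope estimate to narrow the search space can be sketched as predicting where the tracked feature moved in the frame: a yaw delta, converted through the camera's pixels-per-degree, shifts the expected horizontal position. The sign convention (device turns right, feature moves left in the image) and the linear conversion are assumptions for the sketch:

```javascript
// Illustrative use of a gyroscope delta to predict where a tracked feature
// moved in the frame, narrowing the search window for the next image
// analysis pass. Assumes a linear pixels-per-degree conversion.
function predictFeatureX(prevX, imageWidth, fovDegrees, yawDeltaDegrees) {
  const pixelsPerDegree = imageWidth / fovDegrees;
  // Device rotates right (positive yaw) -> feature appears to move left.
  return prevX - yawDeltaDegrees * pixelsPerDegree;
}
```

The image analysis would then search a window around the predicted position rather than the full frame, giving the faster adjustment the text describes.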
Figure 17 illustrates front and back views of an example electronic computing device 1700 that can be used in accordance with various embodiments. Although a portable computing device (e.g., a smartphone, an electronic book reader, or a tablet computer) is shown, it should be understood that any device capable of receiving and processing input can be used in accordance with the various embodiments discussed herein. The devices can include, for example, desktop computers, notebook computers, electronic book readers, personal digital assistants, cellular phones, video gaming consoles or controllers, television set-top boxes, and portable media players, among others.
In this example, the computing device 1700 has a display screen 1702 (e.g., an LCD element) operable to display information or image content to one or more users or viewers of the device. The display screen of some embodiments displays information to viewers facing the display screen (e.g., on the same side of the computing device as the display screen). The computing device in this example can include one or more imaging elements, in this example including two image capture elements 1704 on the front of the device and at least one image capture element 1710 on the back of the device. It should be understood, however, that image capture elements could also, or alternatively, be placed on the sides or corners of the device, and that there can be any appropriate number of capture elements of similar or different types. Each image capture element 1704 and 1710 may be, for example, a camera, a charge-coupled device (CCD), a motion detection sensor, or an infrared sensor, or may utilize other image capturing technology.
As discussed, the device can use images (e.g., still or video) captured from the imaging elements 1704 and 1710 to generate a three-dimensional simulation of the surrounding environment (e.g., a virtual reality of the surrounding environment for display on the display element of the device). Further, the device can utilize outputs from at least one of the image capture elements 1704 and 1710 to assist in determining the location and/or orientation of the user, and in recognizing nearby persons, objects, or locations. For example, if the user is holding the device, the captured image information can be analyzed (e.g., using mapping information about a particular area) to determine the approximate location and/or orientation of the user. The captured image information can also be analyzed to recognize nearby persons, objects, or locations (e.g., by matching parameters or elements from the mapping information).
The computing device can also include at least one microphone or other audio capture element capable of capturing audio data, such as words spoken by a user of the device, music being hummed by a person near the device, or audio being generated by a nearby speaker, although audio elements are not required in at least some devices. In this example there are three microphones: one microphone 1708 on the front side, one microphone 1712 on the back, and one microphone 1706 on or near a top or side of the device. In some devices there may be only one microphone, while in other devices there might be at least one microphone on each side and/or corner of the device, or in other appropriate locations.
The device 1700 in this example also includes one or more orientation- or position-determining elements 1718 operable to provide information such as a position, direction, motion, or orientation of the device. These elements can include, for example, accelerometers, inertial sensors, electronic gyroscopes, and electronic compasses.
The example device also includes at least one communication mechanism 1714, such as may include at least one wired or wireless component operable to communicate with one or more electronic devices. The device also includes a power system 1716, such as may include a battery operable to be recharged through conventional plug-in approaches, or through other approaches such as capacitive charging through proximity with a power mat or other such device. Various other elements and/or combinations are possible as well within the scope of various embodiments.
Figure 18 illustrates a set of basic components of an electronic computing device 1800, such as the device 1700 described with respect to Figure 17. In this example, the device includes at least one processing unit 1802 for executing instructions that can be stored in a memory device or element 1804. As would be apparent to one of ordinary skill in the art, the device can include many types of memory, data storage, or computer-readable media, such as a first data storage for program instructions for execution by the processing unit(s) 1802; the same or separate storage can be used for images or data; a removable memory can be available for sharing information with other devices; and any number of communication approaches can be available for sharing with other devices.
The device typically will include some type of display element 1806, such as a touch screen, electronic ink (e-ink), organic light emitting diode (OLED), or liquid crystal display (LCD), although devices such as portable media players might convey information via other means, such as through audio speakers.
As discussed, the device in many embodiments will include at least one imaging element 1808, such as one or more cameras that are able to capture images of the surrounding environment and that are able to image a user, people, or objects in the vicinity of the device. The image capture element can include any appropriate technology, such as a CCD image capture element having a sufficient resolution, focal range, and viewable area, to capture an image of the user when the user is operating the device. Methods for capturing images using a camera element with a computing device are well known in the art and will not be discussed in detail herein. It should be understood that image capture can be performed using a single image, multiple images, periodic imaging, continuous image capturing, image streaming, etc. Further, a device can include the ability to start and/or stop image capture, such as when receiving a command from a user, application, or other device.
The example computing device 1800 also includes at least one orientation determining element 1810 able to determine and/or detect orientation and/or movement of the device. Such an element can include, for example, an accelerometer or gyroscope operable to detect movement (e.g., rotational movement, angular displacement, tilt, position, orientation, motion along a non-linear path, etc.) of the device 1800. An orientation determining element can also include an electronic or digital compass, which can indicate a direction (e.g., north or south) in which the device is determined to be pointing (e.g., with respect to a primary axis or other such aspect).
As discussed, the device in many embodiments will include at least one positioning element 1812 for determining a location of the device (or the user of the device). A positioning element can include or comprise a GPS or similar location-determining element operable to determine relative coordinates for a position of the device. As mentioned above, positioning elements can include wireless access points, base stations, etc., that may either broadcast location information or enable triangulation of signals to determine the location of the device. Other positioning elements can include QR codes, barcodes, RFID tags, NFC tags, etc., that enable the device to detect and receive location information or identifiers that enable the device to obtain the location information (e.g., by mapping the identifiers to a corresponding location). Various embodiments can include one or more such elements in any appropriate combination.
As mentioned above, some embodiments use one or more elements to track the location of a device. Upon determining an initial position of a device (e.g., using GPS), the device of some embodiments may keep track of the location of the device by using the one or more elements, in some instances by using one or more orientation determining elements as mentioned above, or by a combination thereof. As should be understood, the algorithms or mechanisms used for determining a position and/or orientation can depend at least in part upon the selection of elements available to the device.
The example device also includes one or more wireless components 1814 operable to communicate with one or more electronic devices within a communication range of the particular wireless channel. The wireless channel can be any appropriate channel used to enable devices to communicate wirelessly, such as a Bluetooth, cellular, NFC, or Wi-Fi channel. It should be understood that the device can have one or more conventional wired communications connections, as known in the art.
The device also includes a power system 1816, such as may include a battery operable to be recharged through conventional plug-in approaches, or through other approaches such as capacitive charging through proximity with a power mat or other such device. Various other elements and/or combinations are possible as well within the scope of various embodiments.
In some embodiments the device can include at least one additional input device 1818 able to receive conventional input from a user. This conventional input can include, for example, a push button, touch pad, touch screen, wheel, joystick, keyboard, mouse, keypad, or any other such device or element whereby a user can input a command to the device. These I/O devices could even be connected by a wireless infrared or Bluetooth or other link as well in some embodiments. Some devices can also include a microphone or other audio capture element that accepts voice or other audio commands. For example, a device might not include any buttons at all, but might be controlled only through a combination of visual and audio commands, such that a user can control the device without having to be in contact with the device.
In some embodiments, a device can include the ability to activate and/or deactivate detection and/or command modes, such as when receiving a command from a user or an application, or when retrying to determine an audio input or video input, etc. In some embodiments, a device can include an infrared detector or motion sensor, for example, which can be used to activate one or more detection modes. For example, a device might not attempt to detect or communicate with other devices when there is not a user in the room. If an infrared detector (i.e., a detector with one-pixel resolution that detects changes in state) detects a user entering the room, for example, the device can activate a detection or control mode such that the device can be ready when needed by the user, while conserving power and resources when a user is not nearby.
A computing device, in accordance with various embodiments, may include a light-detecting element that is able to determine whether the device is exposed to ambient light or is in relative or complete darkness. Such an element can be beneficial in a number of ways. In certain conventional devices, a light-detecting element is used to determine when a user is holding a cell phone up to the user's face (causing the light-detecting element to be substantially shielded from the ambient light), which can trigger an action such as the display element of the phone being temporarily shut off (since the user cannot see the display element while the device is up to the user's ear). The light-detecting element could be used in conjunction with information from other elements to adjust the functionality of the device. For example, if the device is unable to detect a user's view location and the user is not holding the device, but the device is exposed to ambient light, the device might determine that it has likely been set down by the user and might turn off the display element and disable certain functionality. If the device is unable to detect the user's view location, the user is not holding the device, and the device is further not exposed to ambient light, the device might determine that the device has been placed in a bag or other compartment that is likely inaccessible to the user, and thus might turn off or disable additional features that might otherwise have been available. In some embodiments, a user must either be looking at the device, holding the device, or have the device out in the light in order to activate certain functionality of the device. In other embodiments, the device may include a display element that can operate in different modes, such as a reflective mode (for bright situations) and an emissive mode (for dark situations). Based on the detected light, the device may change modes.
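The decision logic just described, combining gaze detection, grip detection, and the light sensor, can be sketched as a small state function. The state names and the priority ordering are assumptions for the sketch, not from the specification:

```javascript
// Illustrative decision combining view-location detection, grip detection,
// and ambient light into a device power state. States/ordering assumed.
function powerState({ gazeDetected, holdingDevice, ambientLight }) {
  if (gazeDetected || holdingDevice) return 'active';
  if (ambientLight) return 'display-off'; // likely set down in the open
  return 'stowed';                        // likely in a bag or compartment
}
```

A device could re-evaluate this state whenever any of the three inputs changes, disabling progressively more functionality as it moves from `active` to `stowed`.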
Using the microphone, the device can disable other features for reasons substantially unrelated to power savings. For example, the device can use voice recognition to determine people near the device, such as children, and can disable or enable features, such as Internet access or parental controls, based thereon. Further, the device can analyze recorded noise to attempt to determine an environment, such as whether the device is in a car or on a plane, and that determination can help to decide which features to enable or disable, or which actions to take based upon other inputs. If voice recognition is used, words can be used as input, whether spoken directly to the device or picked up indirectly through conversation. For example, if the device determines that it is in a car, is facing a user, and detects a word such as "hungry" or "eat," then the device might turn on the display element and display information about nearby restaurants, etc. A user can have the option of turning off voice recording and conversation monitoring for privacy and other such purposes.
In some of the above examples, the actions taken by the device relate to deactivating certain functionality for purposes of reducing power consumption. It should be understood, however, that actions can correspond to other functions that can adjust for similar and other potential issues with use of the device. For example, certain functions, such as requesting web page content, searching for content on a hard drive, and opening various applications, can take a certain amount of time to complete. For devices with limited resources, or devices that have heavy usage, a number of such operations occurring at the same time can cause the device to slow down or even lock up, which can lead to inefficiencies, degrade the user experience, and potentially use more power.
In order to address at least some of these and other such issues, approaches in accordance with various embodiments can also utilize information such as user gaze direction to activate resources that are likely to be used, in order to spread out the need for processing capacity, memory space, and other such resources.
In some embodiments, the device can have sufficient processing capability, and the imaging element and associated analytical algorithm(s) may be sensitive enough to distinguish between the motion of the device, motion of a user's head, motion of the user's eyes, and other such motions, based on the captured images alone. In other embodiments, such as where it may be desirable for the process to utilize a fairly simple imaging element and analysis approach, it can be desirable to include at least one orientation determining element that is able to determine a current orientation of the device. In one example, the at least one orientation determining element is at least one single- or multi-axis accelerometer that is able to detect factors such as the three-dimensional position of the device and the magnitude and direction of movement of the device, as well as vibration, shock, etc. Methods for using elements such as accelerometers to determine orientation or movement of a device are also known in the art and will not be discussed in detail herein. Other elements for detecting orientation and/or movement can be used as well, within the scope of various embodiments, for use as the orientation determining element. When the input from an accelerometer or similar element is used along with the input from the camera, the relative movement can be more accurately interpreted, allowing for more precise input and/or less complex image analysis algorithms.
For example, when the imaging element of the computing device is used to detect motion of the device and/or user, the computing device can use the background in the images to determine movement. For example, if a user holds the device at a fixed orientation (e.g., distance, angle, etc.) to the user and the user changes orientation to the surrounding environment, analyzing the image of the user alone will not result in detecting a change in the orientation of the device. Rather, in some embodiments, the computing device can still detect movement of the device by recognizing the changes in the background imagery behind the user. So, for example, if an object (e.g., a window, picture, tree, bush, building, car, etc.) moves to the left or right in the image, the device can determine that the device has changed orientation, even though the orientation of the device with respect to the user has not changed. In other embodiments, the device may detect that the user has moved with respect to the device and adjust accordingly. For example, if the user tilts their head to the left or right with respect to the device, the content rendered on the display element may likewise tilt to keep the content in orientation with the user.
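The "device moved versus user moved" distinction can be sketched by comparing how far the user and the background landmarks shifted between frames: a stable user with a shifted background suggests the device's orientation changed. The shift representation and threshold are assumed values for the sketch:

```javascript
// Illustrative classifier distinguishing device motion from user motion
// using per-frame pixel shifts of the user and of background landmarks.
// The threshold is an assumed value.
function classifyMotion(userShiftPx, backgroundShiftPx, thresholdPx = 10) {
  if (Math.abs(backgroundShiftPx) > thresholdPx &&
      Math.abs(userShiftPx) <= thresholdPx) return 'device-moved';
  if (Math.abs(userShiftPx) > thresholdPx) return 'user-moved';
  return 'stationary';
}
```

A `device-moved` result would trigger the 3D view adjustment even though the user's apparent position did not change, matching the background-based detection described above.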
As discussed, different approaches can be implemented in various environments in accordance with the described embodiments. For example, FIG. 19 illustrates an example of an environment 1900 for implementing aspects in accordance with various embodiments. As will be appreciated, although a network-based environment is used for purposes of explanation, different environments may be used, as appropriate, to implement various embodiments. The system includes electronic client devices 1918, 1920, 1922, and 1924, which can include any appropriate device operable to send and receive requests, messages, or information over an appropriate network 1904 and to convey information back to a user of the device. Examples of such client devices include personal computers, cell phones, handheld messaging devices, laptop computers, set-top boxes, personal data assistants, electronic book readers, and the like. The network can include any appropriate network, including an intranet, the Internet, a cellular network, a local area network, or any other such network or combination thereof. The network could be a "push" network, a "pull" network, or a combination thereof. In a "push" network, one or more of the servers push out data to the client device. In a "pull" network, one or more of the servers send data to the client device upon request for the data by the client device. Components used for such a system can depend at least in part upon the type of network and/or environment selected. Protocols and components for communicating via such a network are well known and will not be discussed herein in detail. Communication over the network can be enabled via wired or wireless connections and combinations thereof. In this example, the network includes the Internet, as the environment includes a web server 1906 for receiving requests and serving content in response thereto, although for other networks an alternative device serving a similar purpose could be used, as would be apparent to one of ordinary skill in the art.
The illustrative environment includes at least one application server 1908 and a data store 1910. It should be understood that there can be several application servers, layers, or other elements, processes, or components, which may be chained or otherwise configured, and which can interact to perform tasks such as obtaining data from an appropriate data store. As used herein, the term "data store" refers to any device or combination of devices capable of storing, accessing, and retrieving data, which may include any combination and number of data servers, databases, data storage devices, and data storage media, in any standard, distributed, or clustered environment. The application server 1908 can include any appropriate hardware and software for integrating with the data store 1910 as needed to execute aspects of one or more applications for the client device, handling a majority of the data access and business logic for an application. The application server provides access control services in cooperation with the data store and is able to generate content such as text, graphics, audio, and/or video to be transferred to the user, which in this example may be served to the user by the web server 1906 in the form of HTML, XML, or another appropriate structured language. The handling of all requests and responses, as well as the delivery of content between the client devices 1918, 1920, 1922, and 1924 and the application server 1908, can be handled by the web server 1906. It should be understood that the web and application servers are not required and are merely example components, as structured code discussed herein can be executed on any appropriate device or host machine as discussed elsewhere herein.
The data store 1910 can include several separate data tables, databases, or other data storage mechanisms and media for storing data relating to a particular aspect. For example, the data store illustrated includes mechanisms for storing content (e.g., production data) 1912 and user information 1916, which can be used to serve content for the production side. The data store is also shown to include a mechanism for storing log or session data 1914. It should be understood that there can be many other aspects that may need to be stored in the data store, such as page image information and access rights information, which can be stored in any of the above-listed mechanisms as appropriate or in additional mechanisms in the data store 1910. The data store 1910 is operable, through logic associated therewith, to receive instructions from the application server 1908 and to obtain, update, or otherwise process data in response thereto. In one example, a user might submit a search request for a certain type of item. In this case, the data store might access the user information to verify the identity of the user, and can access the catalog detail information to obtain information about items of that type. The information can then be returned to the user, such as in a results list on a web page that the user is able to view via a browser on any of the user devices 1918, 1920, 1922, and 1924. Information for a particular item of interest can be viewed in a dedicated page or window of the browser.
Each server typically will include an operating system that provides executable program instructions for the general administration and operation of that server, and typically will include a computer-readable medium storing instructions that, when executed by a processor of the server, allow the server to perform its intended functions. Suitable implementations for the operating system and general functionality of the servers are known or commercially available, and are readily implemented by persons having ordinary skill in the art, particularly in light of the disclosure herein.
The environment in one embodiment is a distributed computing environment utilizing several computer systems and components that are interconnected via communication links, using one or more computer networks or direct connections. However, it will be appreciated by those of ordinary skill in the art that such a system could operate equally well in a system having fewer or a greater number of components than are illustrated in FIG. 19. Thus, the depiction of the system 1900 in FIG. 19 should be taken as being illustrative in nature, and not limiting to the scope of the disclosure.
The various embodiments can be further implemented in a wide variety of operating environments, which in some cases can include one or more user computers or computing devices which can be used to operate any of a number of applications. User or client devices can include any of a number of general purpose personal computers, such as desktop or laptop computers running a standard operating system, as well as cellular, wireless, and handheld devices running mobile software and capable of supporting a number of networking and messaging protocols. Such a system can also include a number of workstations running any of a variety of commercially available operating systems and other known applications for purposes such as development and database management. These devices can also include other electronic devices, such as dummy terminals, thin clients, gaming systems, and other devices capable of communicating via a network.
Most embodiments utilize at least one network that would be familiar to those skilled in the art for supporting communications using any of a variety of commercially available protocols, such as TCP/IP, OSI, FTP, UPnP, NFS, CIFS, and AppleTalk. The network can be, for example, a local area network, a wide area network, a virtual private network, the Internet, an intranet, an extranet, a public switched telephone network, an infrared network, a wireless network, or any combination thereof.
In embodiments utilizing a web server, the web server can run any of a variety of server or mid-tier applications, including HTTP servers, FTP servers, CGI servers, data servers, Java servers, and business application servers. The server(s) may also be capable of executing programs or scripts in response to requests from user devices, such as by executing one or more web applications that may be implemented as one or more scripts or programs written in any programming language, such as C, C#, or C++, or any scripting language, such as Perl, Python, or TCL, as well as combinations thereof. The server(s) may also include database servers, including without limitation commercially available database servers.
The environment can include a variety of data stores and other memory and storage media as discussed above. These can reside in a variety of locations, such as on a storage medium local to (and/or resident in) one or more of the computers, or remote from any or all of the computers across the network. In a particular set of embodiments, the information may reside in a storage-area network (SAN) familiar to those skilled in the art. Similarly, any necessary files for performing the functions attributed to the computers, servers, or other network devices may be stored locally and/or remotely, as appropriate. Where a system includes computerized devices, each such device can include hardware elements that may be electrically coupled via a bus, the elements including, for example, at least one central processing unit (CPU), at least one input device (e.g., a mouse, keyboard, controller, touch-sensitive display element, or keypad), and at least one output device (e.g., a display device, printer, or speaker). Such a system may also include one or more storage devices, such as disk drives, optical storage devices, and solid-state storage devices such as random access memory (RAM) or read-only memory (ROM), as well as removable media devices, memory cards, flash cards, etc.
Such devices can also include a computer-readable storage media reader, a communications device (e.g., a modem, a network card (wireless or wired), an infrared communication device), and working memory as described above. The computer-readable storage media reader can be connected with, or configured to receive, a computer-readable storage medium representing remote, local, fixed, and/or removable storage devices, as well as storage media for temporarily and/or more permanently containing, storing, transmitting, and retrieving computer-readable information. The system and various devices also typically will include a number of software applications, modules, services, or other elements located within at least one working memory device, including an operating system and application programs such as a client application or web browser. It should be appreciated that alternate embodiments may have numerous variations from that described above. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets), or both. Further, connection to other computing devices such as network input/output devices may be employed.
Storage media and computer-readable media for containing code, or portions of code, can include any appropriate media known or used in the art, including storage media and communication media, such as but not limited to volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage and/or transmission of information such as computer-readable instructions, data structures, program modules, or other data, including RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disk (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a system device. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will appreciate other ways and/or methods to implement the various embodiments.
The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the claims.
Embodiments of the disclosure can be described in view of the following clauses:
1. A computing device, comprising:
a display screen;
at least one computing device processor; and
a memory device including instructions that, when executed by the at least one computing device processor, enable the computing device to:
receive a request from a user to display a web page on the display screen of the computing device;
generate a document object model including information for displaying at least one object on the web page, the at least one object being associated with a tag, the tag specifying a three-dimensional appearance change to be applied to the at least one object when a three-dimensional view mode is activated on the display screen;
display a two-dimensional representation of the web page, the web page including the at least one object;
detect a change in orientation of the computing device relative to the user of the computing device;
activate the three-dimensional view mode in response to the change in the orientation of the computing device; and
apply the three-dimensional appearance change specified by the tag to the at least one object.
2. The computing device of clause 1, wherein the three-dimensional appearance change specified by the tag includes at least one of a change in size, shape, shadowing, focus, coloring, position, or animation.
3. The computing device of clause 1, wherein the document object model further includes one or more attributes specifying the three-dimensional appearance change to be applied to the at least one object.
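To make clauses 1-3 concrete, the fragment below sketches what such a tag might look like. The `data-3d-*` attribute names and the `view-3d` class are invented for illustration; the clauses only require that some tag or attribute in the document object model specify the appearance change, not any particular syntax.

```html
<!-- Hypothetical markup: the attribute names are invented for illustration. -->
<img src="product.png"
     data-3d-effect="raise"
     data-3d-depth="40px"
     data-3d-shadow="true">

<style>
  /* A script could add the view-3d class when the 3-D view mode
     activates, letting CSS apply the declared appearance change. */
  .view-3d [data-3d-effect="raise"] {
    transform: perspective(600px) translateZ(40px);
    box-shadow: 0 12px 24px rgba(0, 0, 0, 0.35);
  }
</style>
```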
4. A computer-implemented method, comprising:
processing an object model for displaying at least one object of a plurality of objects on a content page, the at least one object being associated with an attribute, the attribute specifying a three-dimensional appearance change to be applied to the at least one object when a three-dimensional view mode is activated on a display screen;
displaying a two-dimensional view of the content page on a display element of a computing device; and
in response to activating the three-dimensional view mode, applying the three-dimensional appearance change to the at least one object based at least in part on the attribute.
5. The computer-implemented method of clause 4, further comprising:
detecting a change in orientation of the computing device relative to a position of a user of the computing device;
activating the three-dimensional view mode for the content page in response to the change in the orientation of the computing device; and
displaying, when the content page is displayed in the three-dimensional view mode, a three-dimensional representation of the at least one object based at least in part on the attribute, wherein the three-dimensional representation of the at least one object is displayed to appear closer to a surface of the display screen than at least one other object.
6. The computer-implemented method of clause 4, wherein the attribute is specified by at least one of a developer of the content page or an application configured to render the content page.
7. The computer-implemented method of clause 4, wherein the attribute is specified using at least one of an HTML tag, a Cascading Style Sheet, or JavaScript.
8. The computer-implemented method of clause 4, wherein the attribute is configured to specify a plurality of three-dimensional appearance changes, a respective three-dimensional appearance change being caused based at least in part on an orientation of the computing device.
9. The computer-implemented method of clause 4, wherein the object model is organized in a tree structure including one or more nodes, wherein at least a subset of the nodes is associated with at least one attribute.
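Clause 9's tree-structured object model can be sketched as plain nested objects: a walk over the tree collects every node that declares a three-dimensional appearance attribute. The node shape and the `effect3d` key are assumptions for illustration, not part of the clauses.

```javascript
// Sketch of clause 9: walk a tree-structured object model and collect
// the subset of nodes carrying a 3-D appearance attribute. The node
// shape and the "effect3d" key are illustrative assumptions.
function collect3dNodes(node, out = []) {
  if (node.attributes && node.attributes.effect3d) out.push(node);
  for (const child of node.children || []) collect3dNodes(child, out);
  return out;
}

const page = {
  name: 'body',
  children: [
    { name: 'header', attributes: { effect3d: 'raise' }, children: [] },
    {
      name: 'main',
      children: [{ name: 'img', attributes: { effect3d: 'rotate' }, children: [] }],
    },
  ],
};
```

Here only the 'header' and 'img' nodes declare an effect, so only that subset of nodes would receive a three-dimensional appearance change when the view mode activates.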
10. The computer-implemented method of clause 4, wherein the attribute is based at least in part on one of a user profile or a page profile, wherein the user profile includes information indicating at least one of a web page history of a user or a user preference, and the page profile includes information indicating a default three-dimensional representation.
11. The computer-implemented method of clause 4, further comprising:
rendering the content page to appear to be stacked in one or more layers based at least in part on the attribute, wherein at least a first object of the plurality of objects appears to be positioned a distance above at least a second object of the plurality of objects.
12. The computer-implemented method of clause 4, further comprising:
rendering at least one object of the plurality of objects on at least one side of a three-dimensional shape based at least in part on the attribute, the at least one object being rendered to appear to be on the three-dimensional shape.
13. The computer-implemented method of clause 12, further comprising:
detecting, using at least one sensor of the computing device, a change in at least one of an orientation or a position of the computing device; and
adjusting, based at least in part on the attribute, the rendering of the at least one object such that the at least one object appears to move from a first side of the three-dimensional shape to a second side of the three-dimensional shape.
14. The computer-implemented method of clause 4, further comprising:
determining a subsequent change in orientation of the computing device;
animating, based at least in part on the attribute, a rotation of the at least one object to display a second side of the at least one object; and
displaying, on the second side of the at least one object, a second set of content related to the at least one object.
15. The computer-implemented method of clause 4, further comprising:
determining a gaze direction of a user of the computing device;
determining a position on the display element based at least in part on the gaze direction of the user; and
adjusting, based at least in part on the attribute, an appearance of an object located at the position so as to appear magnified relative to other objects displayed on the display element.
16. The computer-implemented method of clause 4, further comprising:
determining an object type of one of the plurality of objects;
determining an action associated with the one of the plurality of objects;
rendering a three-dimensional appearance of the one of the plurality of objects based at least in part on the object type; and
in response to receiving a selection of the one of the plurality of objects, causing the computing device to perform the action.
17. A non-transitory computer-readable storage medium storing one or more sequences of instructions executable by one or more processors to perform a set of operations, the operations comprising:
processing an object model for displaying at least one object of a plurality of objects on a content page, the at least one object being associated with an attribute, the attribute specifying a three-dimensional appearance change to be applied to the at least one object when a three-dimensional view mode is activated on a display screen;
displaying a two-dimensional view of the content page on a display element of a computing device; and
in response to activating the three-dimensional view mode, applying the three-dimensional appearance change to the at least one object based at least in part on the attribute.
18. The non-transitory computer-readable storage medium of clause 17, further comprising instructions executed by the one or more processors to perform the operations of:
rendering the plurality of objects to appear to be stacked in one or more layers based at least in part on the attribute, wherein at least a first object of the plurality of objects appears to be positioned a distance above at least a second object of the plurality of objects.
19. The non-transitory computer-readable storage medium of clause 17, further comprising instructions executed by the one or more processors to perform the operations of:
rendering at least one object of the plurality of objects on at least one side of a three-dimensional shape based at least in part on the attribute, the at least one object being rendered to appear to be on the three-dimensional shape.
20. The non-transitory computer-readable storage medium of clause 17, further comprising instructions executed by the one or more processors to perform the operations of:
detecting, using at least one sensor of the computing device, a change in at least one of an orientation or a position of the computing device; and
rendering, based at least in part on the attribute, a three-dimensional representation of a web map, the web map including a three-dimensional representation of a content page and a three-dimensional representation of a second content page, wherein at least one three-dimensional line is used to connect the three-dimensional representation of the content page to the three-dimensional representation of the second content page.
Various embodiments of the disclosure can also be described in view of the following clauses:
1. A computing device, comprising:
a display screen;
at least one computing device processor; and
a memory device including instructions that, when executed by the at least one computing device processor, enable the computing device to:
display a three-dimensional view of a first web page on the display screen of the computing device, the first web page including a plurality of interface elements;
receive a request from a user to navigate from the first web page to a second web page; and
in response to receiving the request,
animate, on the display screen, a three-dimensional departure of at least a subset of the plurality of interface elements from the three-dimensional view of the first web page, and
animate, on the display screen, a three-dimensional appearance of one or more interface elements of the second web page replacing the subset of the plurality of interface elements of the first web page.
2. The computing device of clause 1, wherein the instructions, when executed, further cause the computing device to:
render an animation of at least one interface element of the plurality of interface elements of the first web page leaving the display screen in at least one direction at a predetermined velocity.
3. The computing device of clause 1, wherein the instructions, when executed, further cause the computing device to:
render an animation of at least one interface element of the plurality of interface elements of the first web page leaving the display screen in at least one direction at a predetermined velocity, and render an animation of at least one interface element of the one or more interface elements of the second web page entering the display screen in at least one direction at a predetermined velocity.
4. The computing device of clause 3, wherein animating the departure of the at least a subset of the plurality of interface elements from the first web page and the three-dimensional appearance of the one or more interface elements of the second web page is configured to mask an amount of latency in rendering the second web page.
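One way to realize clauses 2-4 is a fixed-velocity slide: the outgoing page's elements travel off one edge while the incoming page's elements travel in from the other. The function below is only a sketch; the velocity, the screen width, and the left/right directions are all assumptions.

```javascript
// Sketch of clauses 2-4: at a predetermined velocity, slide the
// outgoing page's elements off the left edge while the incoming
// page's elements enter from the right. All values are assumptions.
function transitionOffsets(elapsedMs, screenWidthPx = 1080, pxPerMs = 2) {
  const travelled = pxPerMs * elapsedMs;
  return {
    outgoingX: -Math.min(travelled, screenWidthPx), // old elements exit left
    incomingX: Math.max(screenWidthPx - travelled, 0), // new elements slide in
    done: travelled >= screenWidthPx,
  };
}
```

Because both animations run from the same clock, the outgoing and incoming elements move in lockstep, which is what lets the transition cover rendering latency for the second page.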
5. A computer-implemented method, comprising:
displaying, on a display screen of a computing device, a first set of content including one or more objects;
receiving a request to access a second set of content;
in response to receiving the request, animating, on the display screen, a departure of the one or more objects of the first set of content; and
displaying, on the display screen of the computing device, the second set of content replacing the first set of content.
6. The computer-implemented method of clause 5, wherein displaying the second set of content further comprises:
animating, on the display screen, a three-dimensional appearance of one or more objects of the second set of content replacing at least a subset of the one or more objects of the first set of content.
7. The computer-implemented method of clause 5, wherein animating the departure of the one or more objects of the first set of content further comprises:
adjusting an appearance of the one or more objects of the first set of content to cause the one or more objects to appear to move from a first position to a second position, the one or more objects appearing closer to a surface of the display screen in the first position than in the second position.
8. The computer-implemented method of clause 5, wherein animating the departure of the one or more objects of the first set of content further comprises:
animating a separation of the one or more objects, wherein a first object is rendered to appear to move away from a second object.
9. The computer-implemented method of clause 8, wherein the first object appears to move away from the second object at a predetermined rate.
10. The computer-implemented method of clause 8, wherein the one or more objects appear to exit the display screen in different directions.
11. The computer-implemented method of clause 6, wherein animating the three-dimensional appearance of the one or more objects of the second set of content further comprises:
rendering a two-dimensional representation of the second set of content replacing the first set of content.
12. The computer-implemented method of clause 11, further comprising:
displaying a three-dimensional representation of at least a subset of the one or more objects, wherein the three-dimensional representation of at least one of the one or more objects is displayed to appear closer to a surface of the display screen than at least one other object of the one or more objects.
13. The computer-implemented method of clause 5, wherein animating the appearance of the one or more objects of the second set of content further comprises:
displaying an animation of the one or more objects of the second set of content being stitched together, wherein at least a first object is rendered to appear to be stitched together with at least a second object.
14. The computer-implemented method of clause 6, wherein the one or more objects of the second set of content appear on the display screen of the computing device at different rates.
15. The computer-implemented method of clause 5, further comprising:
rendering an animation of at least one interface element of a plurality of interface elements of the first set of content leaving the display screen in at least one direction at a predetermined velocity; and
rendering an animation of at least one interface element of the plurality of interface elements of the second set of content entering the display screen in at least one direction at a predetermined velocity.
16. The computer-implemented method of clause 6, further comprising:
determining a network connection speed of the computing device;
determining, based at least in part on the network connection speed, an amount of time for loading the second set of content on the computing device; and
animating the appearance of the second set of content over at least the determined amount of time,
wherein animating the departure of the first set of content and the appearance of the second set of content is configured to mask an amount of latency in rendering the second set of content.
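Clause 16's latency-masking idea, stretching the animation to cover the expected load time, can be sketched as a simple duration calculation. The payload/speed model and the 300 ms floor are assumptions for illustration, not values from the clauses.

```javascript
// Sketch of clause 16: choose an animation duration at least as long
// as the estimated time to load the second set of content, so the
// transition hides network latency. The 300 ms floor is an assumption.
function animationDurationMs(payloadBytes, bytesPerMs, floorMs = 300) {
  const estimatedLoadMs = payloadBytes / bytesPerMs;
  return Math.max(floorMs, estimatedLoadMs);
}
```

For example, a 6,000-byte payload over a 10 bytes/ms connection yields a 600 ms animation, while small payloads fall back to the floor so the transition still reads as deliberate rather than abrupt.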
17. a non-transitory computer readable storage medium, its storage is performed one or more sequences of the instruction of one group of operation by one or more processors, and described operation includes:
On the display element of computing equipment, display includes first group of content of one or more object;
Receive the request accessing second group of content;
in response to receiving the request, animating, on the display element, a departure of the one or more objects of the first set of content; and
displaying, on the display screen of the computing device, a second set of content that replaces the first set of content.
18. The non-transitory computer-readable storage medium of clause 17, wherein displaying the second set of content further includes instructions that, when executed by the one or more processors, cause the one or more processors to:
animate, on the display screen, a three-dimensional appearance of one or more objects of the second set of content replacing the one or more objects of the first set of content, wherein animating the three-dimensional appearance of the one or more objects of the second set of content further includes:
rendering a two-dimensional representation of the second set of content that replaces the first set of content; and
rendering a three-dimensional representation of at least a subset of the one or more objects, wherein the three-dimensional representation of at least one of the one or more objects is displayed to appear closer to a surface of the display screen than at least one other object of the one or more objects.
19. The non-transitory computer-readable storage medium of clause 17, wherein animating the departure of the one or more objects of the first set of content further includes instructions that, when executed by the one or more processors, cause the one or more processors to:
adjust an appearance of the one or more objects of the first set of content to cause the one or more objects to appear to move from a first position to a second position, the one or more objects appearing closer to a surface of the display screen when displayed at the first position than when displayed at the second position; and
animate a separation of the one or more objects, wherein a first object is rendered to appear to move away from a second object, and wherein the first object appears to move away from the second object at a predetermined velocity.
20. The non-transitory computer-readable storage medium of clause 17, wherein the change in orientation of the computing device is determined based at least in part on one of an image captured by a camera of the computing device or at least one sensor of the computing device.
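Clauses 18 and 19 describe an animated transition in which objects of a first content set move away, from a first position near the screen surface toward a second, deeper position, at a predetermined velocity. A minimal sketch of that motion model, assuming a straight-line path and frame-stepped interpolation (function names, units, and the linear model are illustrative, not from the patent):

```python
import math

def animate_departure(start, end, velocity, frame_dt):
    """Yield intermediate positions from start to end at a fixed velocity.

    start, end: (x, y, depth) tuples; depth 0.0 is the screen surface.
    velocity: units per second; frame_dt: seconds per frame.
    """
    dx, dy, dz = (e - s for s, e in zip(start, end))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Number of frames needed to cover the distance at the given velocity.
    steps = max(1, int(distance / (velocity * frame_dt)))
    for i in range(1, steps + 1):
        t = i / steps
        yield (start[0] + dx * t, start[1] + dy * t, start[2] + dz * t)

# An object departing 120 units sideways while receding to depth 0.9,
# at 60 units/second, rendered at 30 frames/second.
frames = list(animate_departure((0, 0, 0.0), (120, 0, 0.9), velocity=60, frame_dt=1 / 30))
```

A renderer would consume one yielded position per frame; a faster `velocity` simply yields fewer frames for the same path.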
Various embodiments of the disclosure can also be described in view of the following clauses:
1. A computing device, comprising:
a display screen;
at least one computing device processor; and
a memory device including instructions that, when executed by the at least one computing device processor, cause the computing device to:
determine a relative position of a user with respect to the computing device;
display a two-dimensional representation of a webpage on the display screen of the computing device, the webpage including a plurality of interface elements;
detect a change in orientation of the computing device relative to the relative position of the user;
activate a three-dimensional view mode for the webpage in response to the change in orientation of the computing device; and
display a three-dimensional representation of at least a subset of the plurality of interface elements while the webpage is displayed in the three-dimensional view mode, wherein the three-dimensional representation of at least one of the plurality of interface elements is displayed to appear closer to a surface of the display screen than at least one other interface element of the plurality of interface elements.
2. The computing device of clause 1, wherein the instructions, when executed, further cause the computing device to:
detect a second change in orientation of the computing device; and
adjust an appearance of at least one interface element,
wherein adjusting the appearance of the at least one interface element includes adjusting at least one of a size, shape, color, hue, or blur of the at least one interface element.
3. The computing device of clause 1, wherein the change in orientation of the computing device is determined based at least in part on one of an image captured by a camera of the computing device or at least one sensor of the computing device.
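Clauses 1-3 tie activation of the three-dimensional view mode to a detected change in device orientation relative to the user, derived from a camera image or another sensor. A hypothetical sketch of the trigger logic, reducing orientation to a single tilt angle; the 15-degree threshold and the toggle behavior are assumptions chosen for illustration:

```python
TILT_THRESHOLD_DEG = 15.0  # assumed activation threshold, not from the patent

def orientation_changed(baseline_deg, current_deg, threshold=TILT_THRESHOLD_DEG):
    """Report whether the device tilt moved far enough from its baseline
    to count as an orientation change for 3D view-mode purposes."""
    return abs(current_deg - baseline_deg) >= threshold

def update_view_mode(mode_active, baseline_deg, current_deg):
    """Toggle the 3D view mode when an orientation change is detected;
    otherwise leave the current mode unchanged."""
    if orientation_changed(baseline_deg, current_deg):
        return not mode_active
    return mode_active
```

A real implementation would feed this from gyroscope/accelerometer readings or from head-position estimates derived from camera frames.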
4. A computer-implemented method, comprising:
displaying a two-dimensional representation of content on a display element of a computing device, the content including a plurality of objects;
detecting an activation event; and
displaying a three-dimensional representation of at least a subset of the plurality of objects in response to detecting the activation event.
5. The computer-implemented method of clause 4, wherein displaying the three-dimensional representation further includes:
rendering at least one object of the plurality of objects to appear closer to a surface of a display screen of the computing device than at least one other object of the plurality of objects, wherein the plurality of objects includes at least one of an image, an advertisement, text, at least one link, or a title,
wherein displaying the three-dimensional representation of at least the subset of the plurality of objects includes rendering each object of the subset of the plurality of objects at a height based at least in part on a weight associated with the corresponding object.
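Clause 5 renders each object of the subset at a height derived from a weight associated with it. One plausible reading of that mapping, sketched with an assumed linear normalization (the weight values and height range are illustrative only):

```python
def heights_from_weights(weights, max_height=1.0):
    """Map non-negative object weights to apparent render heights
    in [0, max_height], proportionally to the largest weight."""
    top = max(weights) if weights else 0
    if top == 0:
        return [0.0 for _ in weights]
    return [max_height * w / top for w in weights]
```

Weights could come from relevance scores, ad priority, or user-profile signals; the normalization keeps the tallest object at a fixed maximum height regardless of the raw scale.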
6. The computer-implemented method of clause 4, wherein the activation event includes at least one of a change in orientation or position of the computing device, a touch input, a voice input, or a gesture input.
7. The computer-implemented method of clause 4, wherein displaying the three-dimensional representation of at least one object is based at least in part on one of a user profile or a page profile, wherein the user profile includes information indicating at least one of a web-page history or a user preference of a user, and the page profile includes information indicating a default three-dimensional representation.
8. The computer-implemented method of clause 5, wherein displaying the three-dimensional representation of the at least one object further includes:
rendering the plurality of objects to appear stacked in one or more layers, wherein at least a first object of the plurality of objects appears to be positioned a distance above at least a second object of the plurality of objects.
9. The computer-implemented method of clause 4, wherein displaying the three-dimensional representation further includes:
rendering at least one object of the plurality of objects on at least one side of a three-dimensional shape, the at least one object being rendered to appear to be on the three-dimensional shape.
10. The computer-implemented method of clause 9, further comprising:
detecting, using at least one sensor of the computing device, a change in at least one of an orientation or a position of the computing device; and
adjusting the rendering of the at least one object so that the at least one object appears to move from a first side of the three-dimensional shape to a second side of the three-dimensional shape.
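Clauses 9-10 render an object on one side of a 3D shape and move it to another side as the device's orientation or position changes. A small sketch of one way to map a rotation angle onto the visible side of a four-sided shape; the angle-to-face mapping is an assumption:

```python
def face_for_yaw(yaw_deg, num_faces=4):
    """Return the index of the face of an n-sided shape that should face
    the viewer for a given device yaw angle, in degrees."""
    # Python's % yields a non-negative result for a positive modulus,
    # so negative angles wrap around correctly.
    return int((yaw_deg % 360) // (360 / num_faces))
```

As the sensed yaw crosses a 90-degree boundary, the returned face index changes, which a renderer would express as the object appearing to move from the first side to the second side of the shape.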
11. The computer-implemented method of clause 4, further comprising:
detecting, using at least one sensor of the computing device, a change in at least one of an orientation or a position of the computing device; and
displaying a first set of content related to at least one object of the plurality of objects on a first side of the at least one object.
12. The computer-implemented method of clause 11, further comprising:
determining a subsequent change in the orientation or position of the computing device;
animating a rotation of the at least one object to display a second side of the at least one object; and
displaying a second set of content related to the at least one object on the second side of the at least one object.
13. The computer-implemented method of clause 4, further comprising:
determining a relative position of a user of the computing device; and
rendering a three-dimensional representation of information displayed in a content page, wherein the information includes at least one of a time indicating the last time the content page was updated, a display element indicating a security level of the content page, or a time indicating the last time the user accessed the content page,
wherein determining the relative position of the user includes capturing an image using at least one camera of the computing device and analyzing the image to determine the relative position of the user.
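Clause 13 determines the user's relative position by capturing and analyzing a camera image. Assuming the analysis step has already produced a face bounding box (face detection itself is outside this sketch), the user's position can be reduced to a normalized offset from the frame center; the coordinate convention here is an assumption:

```python
def relative_position(face_box, frame_w, frame_h):
    """Return (dx, dy) in [-1, 1]: the offset of the face-box center
    from the center of the captured camera frame.

    face_box: (x, y, w, h) in pixels; frame_w, frame_h: frame size in pixels.
    """
    x, y, w, h = face_box
    cx, cy = x + w / 2, y + h / 2
    return (2 * cx / frame_w - 1, 2 * cy / frame_h - 1)
```

The resulting offset could then drive the perspective from which the three-dimensional representation is rendered.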
14. The computer-implemented method of clause 4, further comprising:
determining a gaze direction of a user of the computing device;
determining a position on a display element based at least in part on the gaze direction of the user; and
adjusting an appearance of an object located at the position so that the object appears magnified relative to other objects displayed on the display element.
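Clause 14 magnifies the object at the display position derived from the user's gaze direction. A sketch under assumed data shapes (objects as dictionaries with screen coordinates, and a 1.5x scale factor chosen for illustration):

```python
def magnify_at_gaze(objects, gaze_xy, scale=1.5):
    """Return per-object scale factors; the object whose position is
    closest to gaze_xy is magnified, all others stay at 1.0."""
    def dist2(o):
        return (o["x"] - gaze_xy[0]) ** 2 + (o["y"] - gaze_xy[1]) ** 2
    target = min(range(len(objects)), key=lambda i: dist2(objects[i]))
    return [scale if i == target else 1.0 for i in range(len(objects))]
```

As the estimated gaze point moves across the display element, the magnified object changes accordingly.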
15. The computer-implemented method of clause 4, wherein the content is displayed on a first webpage of a plurality of associated webpages, the method further comprising:
detecting, using at least one sensor of the computing device, a change in at least one of an orientation or a position of the computing device; and
rendering a three-dimensional representation of a web map, the web map including a three-dimensional representation of the first webpage and a three-dimensional representation of a second webpage, the three-dimensional representation of the first webpage being connected to the three-dimensional representation of the second webpage by at least one three-dimensional line.
16. The computer-implemented method of clause 15, wherein the three-dimensional representation of the first webpage is rendered to appear positioned a distance above the three-dimensional representation of the second webpage, and wherein a weight of the at least one three-dimensional line is based at least in part on an amount of network traffic between the first webpage and the second webpage.
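Clauses 15-16 describe a web map in which three-dimensional lines connect page representations, with line weight driven by the traffic between pages. A sketch assuming traffic is supplied as per-edge visit counts and weights are normalized linearly; the data shapes and weight range are assumptions:

```python
def line_weights(traffic, min_w=1.0, max_w=8.0):
    """Map {(page_a, page_b): visit_count} to line weights in [min_w, max_w],
    scaled against the busiest edge in the map."""
    if not traffic:
        return {}
    top = max(traffic.values())
    return {edge: min_w + (max_w - min_w) * count / top
            for edge, count in traffic.items()}
```

The heaviest-trafficked connection renders at full weight, and quieter connections taper toward `min_w` rather than vanishing.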
17. A non-transitory computer-readable storage medium storing one or more sequences of instructions that, when executed by one or more processors, cause the one or more processors to perform a set of operations including:
displaying a two-dimensional representation of content on a display element of a computing device, the content including a plurality of objects;
detecting an activation event; and
displaying a three-dimensional representation of at least a subset of the plurality of objects in response to detecting the activation event.
18. The non-transitory computer-readable storage medium of clause 17, wherein displaying the three-dimensional representation further includes causing the one or more processors to:
render at least one object of the plurality of objects to appear closer to a surface of a display screen of the computing device than at least one other object of the plurality of objects,
wherein the three-dimensional representation of the at least one object is based at least in part on one of a user profile or a page profile, wherein the user profile includes information indicating at least one of a web-page history or a user preference of a user, and the page profile includes information indicating a default three-dimensional representation.
19. The non-transitory computer-readable storage medium of clause 16, further including instructions that, when executed by the one or more processors, cause the one or more processors to:
detect, using at least one sensor of the computing device, a change in at least one of an orientation or a position of the computing device; and
adjust the rendering of the plurality of objects to appear stacked, wherein at least a first object of the plurality of objects appears to be positioned a distance above at least a second object of the plurality of objects.
20. The non-transitory computer-readable storage medium of clause 16, further including instructions that, when executed by the one or more processors, cause the one or more processors to:
render at least one object of the plurality of objects on at least one side of a three-dimensional shape, the at least one object being rendered to appear to be on the three-dimensional shape;
detect, using at least one sensor of the computing device, a change in at least one of an orientation or a position of the computing device; and
adjust the rendering of the at least one object so that the at least one object appears to move from a first side of the three-dimensional shape to a second side of the three-dimensional shape.

Claims (14)

1. A computing device, comprising:
a display screen;
at least one computing device processor; and
a memory device including instructions that, when executed by the at least one computing device processor, cause the computing device to:
receive, from a user, a request to display a webpage on the display screen of the computing device;
generate a document object model, the document object model including information used to display at least one object on the webpage, the at least one object being associated with a tag, the tag specifying a three-dimensional appearance change to be applied to the at least one object when a three-dimensional view mode is activated on the display screen;
display a two-dimensional representation of the webpage, the webpage including the at least one object;
detect a change in orientation of the computing device relative to the user of the computing device;
activate the three-dimensional view mode in response to the change in orientation of the computing device; and
apply the three-dimensional appearance change specified by the tag to the at least one object.
2. The computing device of claim 1, wherein the three-dimensional appearance change specified by the tag includes at least one of a change in size, shape, shadowing, focus, coloring, position, or animation.
3. The computing device of claim 1, wherein the document object model further includes one or more attributes specifying the three-dimensional appearance change to be applied to the at least one object.
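Claims 1-3 describe a document object model in which a tag (or attribute) on an object names the three-dimensional appearance change to apply once the 3D view mode activates. A hypothetical sketch with plain dictionaries standing in for DOM nodes; the attribute name "data-3d" and the change vocabulary are inventions for illustration, not the patent's actual markup:

```python
def apply_3d_mode(dom_objects, mode_active):
    """Return the appearance change to apply for each tagged object.

    dom_objects: list of dicts; a node may carry a "data-3d" entry naming
    its change (e.g. "raise", "shadow", "animate" -- assumed vocabulary).
    """
    changes = {}
    for obj in dom_objects:
        change = obj.get("data-3d")
        if mode_active and change:
            changes[obj["id"]] = change
    return changes
```

Untagged objects are left alone, and no change is produced while the two-dimensional representation is displayed.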
4. A computing device, comprising:
a display screen;
at least one computing device processor; and
a memory device including instructions that, when executed by the at least one computing device processor, cause the computing device to:
display a three-dimensional view of a first webpage on the display screen of the computing device, the first webpage including a plurality of interface elements;
receive, from a user, a request to navigate from the first webpage to a second webpage; and
in response to receiving the request,
animate, on the display screen, a three-dimensional departure of at least a subset of the plurality of interface elements from the three-dimensional view of the first webpage, and
animate, on the display screen, a three-dimensional appearance of one or more interface elements of the second webpage replacing the subset of the plurality of interface elements of the first webpage.
5. The computing device of claim 4, wherein the instructions, when executed, further cause the computing device to:
render an animation of at least one interface element of the plurality of interface elements of the first webpage leaving the display screen in at least one direction at a predetermined velocity.
6. The computing device of claim 4, wherein the instructions, when executed, further cause the computing device to:
render an animation of at least one interface element of the plurality of interface elements of the first webpage leaving the display screen in at least one direction at a predetermined velocity, and render an animation of at least one interface element of the plurality of interface elements of the second webpage entering the display screen in at least one direction at a predetermined velocity.
7. The computing device of claim 6, wherein animating the departure of the at least the subset of the plurality of interface elements from the three-dimensional view of the first webpage and the three-dimensional appearance of the one or more interface elements of the second webpage occurs over an amount of time configured to mask a latency in rendering the second webpage.
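Claim 7 sizes the departure/arrival animation so that it covers the delay of fetching and rendering the second webpage. A minimal sketch of that timing decision; the minimum duration and padding values are assumptions:

```python
def transition_duration(expected_load_ms, min_ms=300, pad_ms=50):
    """Pick an animation duration (ms) long enough to mask the expected
    page-load latency, but never shorter than a floor that keeps the
    transition perceptible."""
    return max(min_ms, expected_load_ms + pad_ms)
```

With an estimate of the second page's load time (for example a rolling average of recent loads), the transition finishes just after the new content is ready, so the user never sees a blank interval.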
8. A computer-implemented method, comprising:
processing an object model to display at least one object of a plurality of objects in a content page, the at least one object being associated with an attribute, the attribute specifying a three-dimensional appearance change to be applied to the at least one object when a three-dimensional view mode is activated on a display screen;
displaying a two-dimensional view of the content page on a display element of a computing device; and
in response to activating the three-dimensional view mode, applying the three-dimensional appearance change to the at least one object based at least in part on the attribute.
9. The computer-implemented method of claim 8, further comprising:
detecting a change in orientation of the computing device relative to a position of a user of the computing device;
activating the three-dimensional view mode for the content page in response to the change in orientation of the computing device; and
displaying, based at least in part on the attribute, a three-dimensional representation of the at least one object when the content page is displayed in the three-dimensional view mode, wherein the three-dimensional representation of the at least one object is displayed to appear closer to a surface of the display screen than at least one other object.
10. The computer-implemented method of claim 8, wherein the attribute is based at least in part on one of a user profile or a page profile, wherein the user profile includes information indicating at least one of a web-page history or a user preference of a user, and the page profile includes information indicating a default three-dimensional representation.
11. The computer-implemented method of claim 8, further comprising:
rendering, based at least in part on the attribute, the content page to appear stacked in one or more layers, wherein at least a first object of the plurality of objects appears to be positioned a distance above at least a second object of the plurality of objects.
12. The computer-implemented method of claim 8, further comprising:
rendering, based at least in part on the attribute, at least one object of the plurality of objects on at least one side of a three-dimensional shape, the at least one object being rendered to appear to be on the three-dimensional shape.
13. The computer-implemented method of claim 12, further comprising:
detecting, using at least one sensor of the computing device, a change in at least one of an orientation or a position of the computing device; and
adjusting, based at least in part on the attribute, the rendering of the at least one object so that the at least one object appears to move from a first side of the three-dimensional shape to a second side of the three-dimensional shape.
15. The computer-implemented method of claim 8, further comprising:
determining a gaze direction of a user of the computing device;
determining a position on a display element based at least in part on the gaze direction of the user; and
adjusting, based at least in part on the attribute, an appearance of an object located at the position so that the object appears magnified relative to other objects displayed on the display element.
CN201480051260.XA 2013-09-17 2014-09-16 Approaches for three-dimensional object display Pending CN105814532A (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US14/029,736 US10592064B2 (en) 2013-09-17 2013-09-17 Approaches for three-dimensional object display used in content navigation
US14/029,747 2013-09-17
US14/029,747 US20150082145A1 (en) 2013-09-17 2013-09-17 Approaches for three-dimensional object display
US14/029,736 2013-09-17
US14/029,756 2013-09-17
US14/029,756 US10067634B2 (en) 2013-09-17 2013-09-17 Approaches for three-dimensional object display
PCT/US2014/055878 WO2015042048A2 (en) 2013-09-17 2014-09-16 Approaches for three-dimensional object display

Publications (1)

Publication Number Publication Date
CN105814532A true CN105814532A (en) 2016-07-27

Family

ID=52689586

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201480051260.XA Pending CN105814532A (en) 2013-09-17 2014-09-16 Approaches for three-dimensional object display

Country Status (5)

Country Link
EP (2) EP3623924A1 (en)
JP (2) JP6201058B2 (en)
CN (1) CN105814532A (en)
CA (1) CA2924496A1 (en)
WO (1) WO2015042048A2 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106529409A (en) * 2016-10-10 2017-03-22 中山大学 Eye ocular fixation visual angle measuring method based on head posture
CN106774821A (en) * 2016-11-08 2017-05-31 广州视源电子科技股份有限公司 display method and system based on virtual reality technology
WO2018090871A1 (en) * 2016-11-15 2018-05-24 腾讯科技(深圳)有限公司 Application service indication method and application service indication device
CN108304590A (en) * 2018-03-09 2018-07-20 百度在线网络技术(北京)有限公司 Web page display method, apparatus, equipment and the computer-readable medium of virtual reality
US10067634B2 (en) 2013-09-17 2018-09-04 Amazon Technologies, Inc. Approaches for three-dimensional object display
CN109643212A (en) * 2016-09-20 2019-04-16 苹果公司 3D document editing system
CN109643219A (en) * 2016-09-01 2019-04-16 大众汽车有限公司 Method for being interacted with the picture material presented in display equipment in the car
CN109901710A (en) * 2016-10-19 2019-06-18 腾讯科技(深圳)有限公司 Treating method and apparatus, storage medium and the terminal of media file
US10592064B2 (en) 2013-09-17 2020-03-17 Amazon Technologies, Inc. Approaches for three-dimensional object display used in content navigation

Families Citing this family (12)

Publication number Priority date Publication date Assignee Title
RU2711468C2 (en) * 2015-04-01 2020-01-17 Конинклейке Филипс Н.В. Electronic mobile device
JP6723895B2 (en) 2016-10-25 2020-07-15 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
US10083640B2 (en) * 2016-12-29 2018-09-25 Pure Depth Limited Multi-layer display including proximity sensor and depth-changing interface elements, and/or associated methods
EP3435250A1 (en) * 2017-07-27 2019-01-30 Vestel Elektronik Sanayi ve Ticaret A.S. Method, apparatus and computer program for overlaying a web page on a 3d object
JP6584459B2 (en) * 2017-07-31 2019-10-02 ヤフー株式会社 Information display program, information display device, information display method, and distribution device
JP2019053387A (en) * 2017-09-13 2019-04-04 パイオニア株式会社 Operation input system, operation input control method, and operation input control program
JP6900133B2 (en) * 2018-01-25 2021-07-07 三菱電機株式会社 Gesture operation device and gesture operation method
JP6901453B2 (en) * 2018-03-20 2021-07-14 ヤフー株式会社 Information display program, information display device, information display method and distribution device
JP6431227B1 (en) * 2018-03-20 2018-11-28 ヤフー株式会社 Information display program, information display device, information display method, and distribution device
JP6987723B2 (en) * 2018-09-12 2022-01-05 ヤフー株式会社 Information display program, information display device, information display method and distribution device
JP7413758B2 (en) * 2019-12-19 2024-01-16 富士フイルムビジネスイノベーション株式会社 Information processing device and program
JP7467748B1 (en) 2023-09-27 2024-04-15 Kddi株式会社 Display control device, display system and program

Citations (10)

Publication number Priority date Publication date Assignee Title
EP0816983A2 (en) * 1996-06-25 1998-01-07 Sun Microsystems, Inc. Method and apparatus for eyetrack-driven text enlargement
CN1505869A (en) * 2001-04-26 2004-06-16 Method and apparatus for displaying prioritized icons in a mobile terminal
CN101019422A (en) * 2004-06-30 2007-08-15 皇家飞利浦电子股份有限公司 Method and apparatus for intelligent channel zapping
US20080036776A1 (en) * 2004-04-16 2008-02-14 Apple Inc. User interface for controlling three-dimensional animation of an object
US20100064209A1 (en) * 2008-09-10 2010-03-11 Advanced Digital Broadcast S.A. Method for transforming web page objects
US20120036433A1 (en) * 2010-08-04 2012-02-09 Apple Inc. Three Dimensional User Interface Effects on a Display by Using Properties of Motion
GB2490865A (en) * 2011-05-09 2012-11-21 Nds Ltd User device with gaze tracing to effect control
US20130091462A1 (en) * 2011-10-06 2013-04-11 Amazon Technologies, Inc. Multi-dimensional interface
US20130093764A1 (en) * 2011-10-18 2013-04-18 Research In Motion Limited Method of animating a rearrangement of ui elements on a display screen of an electronic device
US20130106831A1 (en) * 2011-10-28 2013-05-02 Cbs Interactive, Inc. 3-d presentation of information

Family Cites Families (17)

Publication number Priority date Publication date Assignee Title
JPH0749744A (en) * 1993-08-04 1995-02-21 Pioneer Electron Corp Head mounting type display input device
WO1998014882A1 (en) * 1996-09-30 1998-04-09 Sandcastle, Inc. Synchronization of interactions between objects distributed over a network in the presence of latency
US6161112A (en) * 1998-05-19 2000-12-12 International Business Machines Corporation Web page presentation control mechanism and method
EP1334427A2 (en) * 2000-04-19 2003-08-13 Koninklijke Philips Electronics N.V. Method and apparatus for adapting a graphical user interface
JP2002133263A (en) * 2000-10-23 2002-05-10 Nippon Telegr & Teleph Corp <Ntt> Method and system for presenting advertisement
DE602004011676T2 (en) * 2004-12-02 2009-01-22 Sony Ericsson Mobile Communications Ab Portable communication device with a three-dimensional display device
US8180672B2 (en) * 2006-02-16 2012-05-15 Hillcrest Laboratories, Inc. Systems and methods for placing advertisements
EP1884863A1 (en) * 2006-08-02 2008-02-06 Research In Motion Limited System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device
US8572501B2 (en) * 2007-06-08 2013-10-29 Apple Inc. Rendering graphical objects based on context
US20100045596A1 (en) * 2008-08-21 2010-02-25 Sony Ericsson Mobile Communications Ab Discreet feature highlighting
JP4926207B2 (en) * 2009-05-21 2012-05-09 ヤフー株式会社 Display switching device and method for clearly indicating transition to three-dimensional display
JP5513071B2 (en) * 2009-10-26 2014-06-04 株式会社プロフィールド Information processing apparatus, information processing method, and program
GB2497624B8 (en) * 2010-03-22 2015-01-14 Mobitv Inc Tile based media content selection
CN101930367B (en) * 2010-08-25 2014-04-30 中兴通讯股份有限公司 Implementation method of switching images and mobile terminal
KR101340797B1 (en) * 2010-09-01 2013-12-11 주식회사 팬택 Portable Apparatus and Method for Displaying 3D Object
US9069577B2 (en) * 2010-11-23 2015-06-30 Apple Inc. Grouping and browsing open windows
JP5742187B2 (en) * 2010-11-26 2015-07-01 ソニー株式会社 Information processing apparatus, information processing method, and program


Non-Patent Citations (2)

Title
C. SINTHANAYOTHIN et al.: "Interactive virtual 3D gallery using motion detection of mobile device", International Conference on Mobile IT Convergence *
JESÚS GIMENO et al.: "New Developments in Web3D and Web 3D Visualization: WebGL and O3D as Examples", Proceedings of the 2009 International Conference on Image Processing *

Cited By (11)

Publication number Priority date Publication date Assignee Title
US10067634B2 (en) 2013-09-17 2018-09-04 Amazon Technologies, Inc. Approaches for three-dimensional object display
US10592064B2 (en) 2013-09-17 2020-03-17 Amazon Technologies, Inc. Approaches for three-dimensional object display used in content navigation
CN109643219A (en) * 2016-09-01 2019-04-16 大众汽车有限公司 Method for being interacted with the picture material presented in display equipment in the car
CN109643212A (en) * 2016-09-20 2019-04-16 苹果公司 3D document editing system
CN109643212B (en) * 2016-09-20 2022-04-01 苹果公司 3D document editing system
CN106529409A (en) * 2016-10-10 2017-03-22 中山大学 Eye ocular fixation visual angle measuring method based on head posture
CN109901710A (en) * 2016-10-19 2019-06-18 腾讯科技(深圳)有限公司 Treating method and apparatus, storage medium and the terminal of media file
CN106774821A (en) * 2016-11-08 2017-05-31 广州视源电子科技股份有限公司 display method and system based on virtual reality technology
CN106774821B (en) * 2016-11-08 2020-05-19 广州视源电子科技股份有限公司 Display method and system based on virtual reality technology
WO2018090871A1 (en) * 2016-11-15 2018-05-24 腾讯科技(深圳)有限公司 Application service indication method and application service indication device
CN108304590A (en) * 2018-03-09 2018-07-20 百度在线网络技术(北京)有限公司 Web page display method, apparatus, equipment and the computer-readable medium of virtual reality

Also Published As

Publication number Publication date
JP6605000B2 (en) 2019-11-13
CA2924496A1 (en) 2015-03-26
JP2018022503A (en) 2018-02-08
EP3047363A4 (en) 2017-05-17
JP6201058B2 (en) 2017-09-20
EP3047363A2 (en) 2016-07-27
EP3623924A1 (en) 2020-03-18
WO2015042048A3 (en) 2015-05-28
JP2017501500A (en) 2017-01-12
WO2015042048A2 (en) 2015-03-26
EP3047363B1 (en) 2019-11-06

Similar Documents

Publication Publication Date Title
CN105814532A (en) Approaches for three-dimensional object display
US20180348988A1 (en) Approaches for three-dimensional object display
US10592064B2 (en) Approaches for three-dimensional object display used in content navigation
US9880640B2 (en) Multi-dimensional interface
US9591295B2 (en) Approaches for simulating three-dimensional views
US20150082145A1 (en) Approaches for three-dimensional object display
US9013416B2 (en) Multi-display type device interactions
US9104293B1 (en) User interface points of interest approaches for mapping applications
US9094670B1 (en) Model generation and database
US9437038B1 (en) Simulating three-dimensional views using depth relationships among planes of content
US9160993B1 (en) Using projection for visual recognition
US20190333478A1 (en) Adaptive fiducials for image match recognition and tracking
US9389703B1 (en) Virtual screen bezel
US20160034143A1 (en) Navigating digital content by tilt gestures
US9665249B1 (en) Approaches for controlling a computing device based on head movement
US10585485B1 (en) Controlling content zoom level based on user head movement
US9524036B1 (en) Motions for displaying additional content
US9109921B1 (en) Contextual based navigation element
US10867445B1 (en) Content segmentation and navigation
Hemsley Flow: a framework for reality-based interfaces

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20210409

AD01 Patent right deemed abandoned