US20170308272A1 - Virtual reality applications - Google Patents

Virtual reality applications

Info

Publication number
US20170308272A1
Authority
US
United States
Prior art keywords
applications
scene
detected
component
contextual information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/633,636
Inventor
Roy Levien
Mark Malamud
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Empire Technology Development LLC
Original Assignee
Empire Technology Development LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Empire Technology Development LLC filed Critical Empire Technology Development LLC
Priority to US15/633,636
Publication of US20170308272A1
Assigned to CRESTLINE DIRECT FINANCE, L.P. (security interest; assignor: Empire Technology Development LLC)
Legal status: Abandoned

Classifications

    • G: Physics
      • G06: Computing; Calculating or Counting
        • G06F: Electric Digital Data Processing
          • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
            • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
                • G06F 3/0481: Based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                  • G06F 3/04817: Using icons
                  • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
                • G06F 3/0484: For the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                  • G06F 3/04842: Selection of displayed objects or displayed text elements
          • G06F 8/00: Arrangements for software engineering
            • G06F 8/60: Software deployment
              • G06F 8/61: Installation
        • G06T: Image Data Processing or Generation, in General
          • G06T 11/00: 2D [Two Dimensional] image generation


Abstract

Augmented reality technology is described. The technology can detect objects in a scene, identify one or more installed or available applications based on the detected objects, and place icons representing the identified applications proximate to the detected objects in a display of the scene, e.g., so that a user can start or install the identified applications. The technology can also facilitate interaction with an identified object, e.g., to remotely control a recognized object.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a divisional application under 35 U.S.C. §121 of and claims priority under 35 U.S.C. §120 to U.S. application Ser. No. 13/821,560, filed on Mar. 7, 2013, entitled “VIRTUAL REALITY APPLICATIONS,” which in turn is a U.S. National Stage filing under 35 U.S.C. §371 of International Application No. PCT/US2012/052320, filed on Aug. 24, 2012 and entitled “VIRTUAL REALITY APPLICATIONS.” U.S. application Ser. No. 13/821,560 and International Application No. PCT/US12/52320, including any appendices or attachments thereof, are hereby incorporated by reference in their entirety.
  • BACKGROUND
  • The number of mobile computing devices in use has increased dramatically over the last decade and continues to increase. Examples of mobile computing devices are mobile telephones, digital cameras, and global positioning system (“GPS”) receivers. According to one study, 60% of the world's population has access to mobile telephones. An increasing number of people use digital cameras and some manufacturers of digital cameras presently have revenues of tens of billions of United States dollars annually. Digital cameras are used to capture, store, and share images. Often, the images can be viewed nearly immediately after they are captured, such as on a display device associated with the digital cameras. Once an image is captured, it can be processed by computing devices. Image recognition is one such process that can be used to recognize and identify objects in an image. For example, image recognition techniques can determine whether an image contains a human face, a particular object or shape, etc.
  • Augmented reality is a view of a physical, real-world environment that is enhanced by computing devices to digitally augment visual or auditory information a user observes in the real world. As an example, an augmented reality system can receive scene information from a digital camera and a GPS, identify objects (e.g., people, animals, structures, etc.) in the scene, and provide additional information relating to the identified objects. A user of such a system can take a photo of a scene using a mobile computing device (e.g., a digital camera, a cellular phone, a “smartphone,” etc.) and automatically receive information about one or more objects an augmented reality system recognizes in the photographed (i.e., digitized) scene.
  • There are now hundreds of thousands of applications available for mobile devices. Users can download and install applications (“apps”) that are interesting or useful to them. However, finding such applications can be challenging.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a flow diagram illustrating a routine invoked by the disclosed technology in various embodiments.
  • FIG. 2 is an environmental diagram illustrating use of the disclosed technology in various embodiments.
  • FIG. 3 is a block diagram illustrating components employed by the disclosed technology in various embodiments.
  • FIG. 4 is a flow diagram illustrating a routine invoked by the disclosed technology in various embodiments.
  • FIG. 5 is a flow diagram illustrating a routine invoked by the disclosed technology in various embodiments.
  • FIGS. 6A and 6B are environmental diagrams illustrating use of the disclosed technology in various embodiments.
  • FIG. 7 is a flow diagram illustrating a routine invoked by the disclosed technology in various embodiments.
  • FIG. 8 is a block diagram of an illustrative embodiment of a computing device that is arranged in accordance with at least some embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the scope of the subject matter presented herein. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the Figures, can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations, all of which are explicitly contemplated herein.
  • Augmented reality technology (“the technology”) is described. In various embodiments, the technology detects objects in a scene, identifies one or more installed applications based on at least one detected object, and displays an icon representing the identified one or more applications, e.g., proximate to the detected object(s) in a display of the scene. The technology can use various techniques for object recognition, e.g., image recognition, pattern recognition, etc. When a user selects a displayed icon, an application corresponding to the selected icon can start. In various embodiments, the technology can instead (or additionally) identify available but not-yet-installed applications based on at least one detected object. When the user selects a displayed icon, an application corresponding to the selected icon can be installed and optionally started. Thus, the technology enables users to quickly identify applications that may be pertinent to the context or milieu in which they find themselves.
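  • This detect-identify-display loop can be summarized in code. The following is a minimal sketch under stated assumptions, not the patented implementation: `DetectedObject`, `AppMatch`, the `catalog` dictionary, and its `object_labels`/`installed` fields are all illustrative stand-ins for whatever object recognizer and application-attribute store an embodiment actually uses.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class DetectedObject:
    label: str                          # e.g. "television", "restaurant"
    bounds: Tuple[int, int, int, int]   # (x, y, width, height) on screen

@dataclass
class AppMatch:
    app_id: str
    installed: bool

def identify_apps(obj: DetectedObject, catalog: Dict[str, dict]) -> List[AppMatch]:
    """Match an object against stored application attributes (hypothetical schema)."""
    return [AppMatch(app_id, meta["installed"])
            for app_id, meta in catalog.items()
            if obj.label in meta["object_labels"]]

def place_icons(objects: List[DetectedObject],
                catalog: Dict[str, dict]) -> List[Tuple[str, Tuple[int, int]]]:
    """Return (app_id, screen position) pairs, one icon near each matched object."""
    placements = []
    for obj in objects:
        x, y, _, _ = obj.bounds
        for match in identify_apps(obj, catalog):
            # Draw the icon just above the object's bounding box.
            placements.append((match.app_id, (x, max(0, y - 32))))
    return placements
```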
  • The technology can employ a digital camera configured for use with the mobile device that the user employs to digitize a scene. The mobile device can also process contextual information, e.g., GPS coordinates. Some applications may correspond to contextual information. As an example, when the scene includes a particular restaurant, an identified application can be an application corresponding to the restaurant. If multiple applications correspond to an object (e.g., a restaurant, store, or other establishment or object), the technology may identify applications suitable for the current location (e.g., GPS coordinates). Alternatively, a user can specify the contextual information to use. As an example, the technology may by default identify applications for establishments that are open at the current time, but the user may be interested only in applications corresponding to establishments that are open for dinner later in the day.
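  • As a concrete illustration of such context filtering, the sketch below (reusing the `AppMatch`/`catalog` assumptions above) keeps only applications whose hypothetical `hours` attribute covers the current time, unless the user overrides the default with a `meal` of interest; the `hours` and `meals` attribute names are assumptions.

```python
from datetime import datetime, time

def filter_by_context(matches, catalog, now=None, meal=None):
    """Drop applications that do not fit the current (or user-chosen) context."""
    now = now or datetime.now().time()
    kept = []
    for m in matches:
        meta = catalog[m.app_id]
        if meal is not None:
            # User-specified context, e.g. only places serving dinner later today.
            if meal not in meta.get("meals", ()):
                continue
        else:
            # Default context: keep establishments that are open right now.
            open_from, open_to = meta.get("hours", (time(0, 0), time(23, 59)))
            if not (open_from <= now <= open_to):
                continue
        kept.append(m)
    return kept
```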
  • The technology can identify applications, e.g., by matching attributes corresponding to installed (or available) applications to the present context or milieu, e.g., based on attributes of matched objects. The attributes can be stored locally on a mobile device or at a server. A user can also associate or disassociate an application with recognized objects, e.g., so that a particular application's icon is visible or removed the next time an object is in a digitized scene.
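  • A minimal local store supporting that attribute matching, association, and disassociation might look like the following; the JSON file name and schema are assumptions, since the patent leaves storage (local device versus server) open.

```python
import json

class AppAttributeStore:
    """User-editable mapping from recognized-object labels to application ids."""

    def __init__(self, path="app_attributes.json"):
        self.path = path
        try:
            with open(path) as f:
                self.by_label = json.load(f)
        except FileNotFoundError:
            self.by_label = {}

    def applications_for(self, label):
        """Applications whose stored attributes match this recognized object."""
        return list(self.by_label.get(label, []))

    def associate(self, label, app_id):
        """Make app_id's icon appear the next time `label` is in a scene."""
        apps = self.by_label.setdefault(label, [])
        if app_id not in apps:
            apps.append(app_id)
        self._save()

    def disassociate(self, label, app_id):
        """Remove app_id's icon for future scenes containing `label`."""
        if app_id in self.by_label.get(label, []):
            self.by_label[label].remove(app_id)
        self._save()

    def _save(self):
        with open(self.path, "w") as f:
            json.dump(self.by_label, f)
```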
  • In various embodiments, the technology can also identify applications based on stored application “preferences.” These preferences may be indicated by application developers, organizations, etc. As an example, when a user is in a particular geographic area, a “sponsored application” may be identified by the technology.
  • When multiple application icons are identified, the technology may use various techniques to alter the user interface. As examples, the icons may be stacked; some icons may appear before other applications in the stack; some icons (e.g., sponsored applications) may be larger than other icons; etc.
  • The technology can also adapt the icons for applications, e.g., so that the icons are representative of underlying information. As an example, a restaurant review application's icon may be identified for many restaurants, and the icon may change to indicate a review for the recognized restaurant.
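  • For example, the ordering, sizing, and adaptation rules described in the last two paragraphs could be applied in one pass, as in this sketch; the `sponsored` flag, pixel sizes, and review badge are illustrative assumptions.

```python
def decorate_icons(matches, catalog, reviews=None):
    """Order icons (sponsored first), size them, and attach any known review badge."""
    reviews = reviews or {}
    ordered = sorted(matches,
                     key=lambda m: not catalog[m.app_id].get("sponsored", False))
    return [(m.app_id,
             64 if catalog[m.app_id].get("sponsored") else 48,  # icon size in px
             reviews.get(m.app_id))                             # e.g. "4.2 stars"
            for m in ordered]
```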
  • In various embodiments, the technology can detect objects in a scene, associate the detected objects with methods for interacting with the detected objects, obtain a specification for interacting with the detected objects using the associated methods; and provide a user interface for controlling the detected objects. As an example, when an audiovisual device (e.g., television, DVD player, etc.) is detected in a scene, the technology can communicate with the detected device (e.g., using WiFi, radiofrequency, infrared, or other communications means) and obtain a specification for interacting with the device. The specification can provide information, e.g., available commands, how the commands are to be sent, the format for the commands, etc. The specification can also provide information about user interface elements. Upon receiving the specification, the technology can provide a user interface that a user can use to control the device. When the user interacts via the user interface, the technology can transmit commands to the device. In various embodiments, the technology can communicate with the detected objects by employing a radiofrequency identification tag, wireless network, infrared signal, etc.
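  • The specification exchange might look like the sketch below; the wire format, field names, and `transmit` callback are all assumptions, since the patent does not fix a format for the specification or the commands.

```python
# A control specification as a detected television might report it (hypothetical).
TV_SPEC = {
    "device": "television",
    "transport": "wifi",                                 # or "infrared", "rf"
    "commands": {
        "power":  {"format": "PWR {value}", "values": ["on", "off"]},
        "volume": {"format": "VOL {value}", "values": list(range(101))},
    },
    "ui_hints": {"power": "toggle", "volume": "slider"},  # UI element suggestions
}

def build_controls(spec):
    """Turn the specification's UI hints into abstract widget descriptions."""
    return [(name, spec["ui_hints"].get(name, "button"))
            for name in spec["commands"]]

def send_command(spec, name, value, transmit):
    """Format a command as the specification dictates and hand it to a transport."""
    cmd = spec["commands"][name]
    if value not in cmd["values"]:
        raise ValueError(f"{value!r} not allowed for {name}")
    transmit(cmd["format"].format(value=value))
```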
  • In various embodiments, the technology includes a component configured for use with a device that receives a signal from a computing device, provides an identification of one or more methods operable to control the device, receives a command from the computing device wherein the command was identified in the one or more methods, and controls the device according to the received command. The command can be to control media (e.g., play, stop, pause, rewind, fast forward, etc.), control a power circuit (e.g., turn on/off), etc. The component may also provide a specification for the one or more methods, e.g., a hint for providing a user interface component.
  • Turning now to the figures, FIG. 1 is a flow diagram illustrating a routine 100 invoked by the disclosed technology in various embodiments. The routine 100 begins at block 102. The routine 100 then continues at block 104, where it receives a digitized version of a scene. The routine 100 then continues at block 106, where it detects objects in the scene. In various embodiments, the routine 100 may employ various image recognition techniques to recognize objects. The routine 100 then continues at block 108, where it receives contextual information. Examples of contextual information are location information (GPS coordinates, street address, city, etc.), time of day, etc. The routine 100 then continues at block 110, where it identifies applications based on the detected objects. As an example, when the technology recognizes a television, the technology may indicate an application that provides current television listings. As another example, when the technology recognizes a restaurant, the technology may identify an application that is associated with the restaurant, e.g., to provide menus, reserve seats, etc. The routine 100 then continues at block 112, where it filters the identified objects based on contextual information. As an example, if two restaurants are identified in the scene and one of them is open only for lunch and dinner, then when the present time falls within what would normally be considered breakfast hours, the technology may identify only the other restaurant, the one open for breakfast. The routine 100 then continues at block 114, where it places an icon representing identified applications near detected objects. The routine 100 then returns at block 116.
  • Those skilled in the art will appreciate that the logic illustrated in FIG. 1 and described above, and in each of the flow diagrams discussed below, may be altered in a variety of ways. For example, the order of the logic may be rearranged, substeps may be performed in parallel, illustrated logic may be omitted, other logic may be included, etc.
  • FIG. 2 is an environmental diagram illustrating use of the disclosed technology in various embodiments. A scene 200 includes three objects: a first object 202, a second object 204, and a third object 206. A display of a mobile computing device 208 displays digitized representations of the objects as a digitized representation of the first object 210, a digitized representation of the second object 212, and a digitized representation of the third object 214. The digitized representation of the first object 210 is associated with a first icon 216A and a second icon 216B. The digitized representation of the second object 212 is associated with a third icon 218. As described above, the icons can represent installed applications or available applications. When a user selects an icon, e.g., by touching an area near the icon on a touchscreen of the mobile computing device, the technology may launch the indicated application (if already installed) or install the indicated application. In some embodiments, the technology may automatically launch applications that are already installed.
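  • That selection behavior reduces to a small dispatch, sketched below; `installer` and `launcher` are hypothetical stand-ins for whatever platform services perform installation and application launch.

```python
def on_icon_selected(app_id, installed_apps, installer, launcher):
    """Launch the application if it is installed; otherwise install it first."""
    if app_id not in installed_apps:
        installer(app_id)   # the text says the app may optionally start afterward
    launcher(app_id)
```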
  • FIG. 3 is a block diagram illustrating components employed by the disclosed technology in various embodiments. The components 300 can include a digitizer 302, a recognizer 304, application attributes 306, and an identifier 308. In various embodiments, additional components (not illustrated) or a subset of the illustrated components 300 can be employed without deviating from the scope of the claimed technology. The digitizer component 302 can digitize a scene, e.g., a scene received via an image capture device (not illustrated). The recognizer component 304 can recognize objects in a digitized scene. In various embodiments, the recognizer component can use various image recognition techniques to recognize objects in the digitized scene. The identifier component 308 can identify installed applications to be associated with recognized objects, e.g., using stored application attributes 306. The attributes can indicate, e.g., information about objects with which applications are associated, time of day, location, etc. The identifier component 308 can also employ a server computing device (not illustrated), e.g., to identify applications that are not presently installed but may be associated with recognized objects.
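  • Expressed as code, the FIG. 3 components compose as in this sketch; the single-method interfaces are assumptions about how an embodiment might divide the work among components 302 through 308.

```python
class Pipeline:
    """Wires digitizer -> recognizer -> identifier, mirroring components 302-308."""

    def __init__(self, digitizer, recognizer, identifier):
        self.digitizer = digitizer    # digitize(frame) -> scene
        self.recognizer = recognizer  # recognize(scene) -> list of objects
        self.identifier = identifier  # identify(obj) -> list of applications

    def process(self, frame):
        """Map each recognized object's label to its identified applications."""
        scene = self.digitizer.digitize(frame)
        return {obj.label: self.identifier.identify(obj)
                for obj in self.recognizer.recognize(scene)}
```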
  • FIG. 4 is a flow diagram illustrating a routine 400 invoked by the disclosed technology in various embodiments. The routine 400 begins at block 402. The routine 400 continues at block 404, where it receives a digitized version of a scene. The routine 400 then continues at block 406, where it detects objects in the digitized scene. The routine 400 can employ various image recognition techniques to recognize objects. The routine 400 then continues at block 408, where it identifies applications that are not stored locally, e.g., on a mobile computing device on which the routine executes. The routine 400 then continues at block 410, where it places an icon for an application near a recognized object. As an example, if the routine 400 recognizes a coffee shop in a digitized scene and the user's mobile computing device does not have an application corresponding to the recognized coffee shop installed, the routine 400 may place an icon for an application corresponding to the recognized coffee shop that, if selected, causes the application to be installed. The routine 400 then returns at block 412.
  • FIG. 5 is a flow diagram illustrating a routine 500 invoked by the disclosed technology in various embodiments. The routine 500 begins at block 502. The routine then continues at block 504, where it detects objects in a scene. The routine 500 then continues at block 506, where it selects a first object from the recognized objects. The routine 500 then continues at decision block 508, where it determines whether the selected object can be interacted with. The routine 500 may make this determination, e.g., by querying a database (e.g., a local database stored at the computing device invoking the routine 500 or a remote database stored at a server computing device). If the selected object can be interacted with, the routine 500 continues at block 510. Otherwise, the routine 500 continues at block 514. At block 510, the routine 500 obtains specifications for interacting with the selected object. In various embodiments, the routine 500 may obtain the specifications from a locally stored database, the object directly (e.g., wirelessly), or from a remote computing device. The routine 500 may then provide a user interface to a user so that the user can interact with the selected object. As an example, if the recognized object is a television or other audiovisual device, the routine 500 may provide a user interface that enables the user to control the audiovisual device. The received specifications can include instructions for providing aspects of the user interface. The routine 500 then continues at block 514, where it selects a next object from the set of objects detected above in relation to block 504. The routine 500 then continues at decision block 516, where it determines whether a next object was selected. If there are no more objects to be selected, the routine 500 returns at block 518. Otherwise, the routine 500 continues at decision block 508 to analyze the selected object.
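  • Routine 500's per-object loop might be written as follows; the `can_interact`, `get_spec`, and `show_ui` callbacks are placeholders for the database query, specification fetch, and user-interface steps described above.

```python
def control_loop(objects, can_interact, get_spec, show_ui):
    """Visit each detected object; surface a control UI for the controllable ones."""
    for obj in objects:               # blocks 506/514/516: iterate the detected set
        if not can_interact(obj):     # block 508: consult a local or remote database
            continue
        spec = get_spec(obj)          # block 510: from a local DB, the object itself
                                      # (wirelessly), or a remote computing device
        show_ui(obj, spec)            # then present the control interface
```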
  • FIGS. 6A and 6B are environmental diagrams illustrating use of the disclosed technology in various embodiments. FIG. 6A includes a scene 600 and a digitized version of the scene 600 displayed at a mobile computing device 606. The scene 600 includes a television 602 and another object 604. The digitized version of the scene 600 displayed at the mobile computing device 606 includes a digitized representation of the television 608 and a digitized representation of the other object 610. The mobile computing device 606 also displays an icon 612 associated with the digitized representation of the television 608. As an example, the technology may have recognized the television 602 and identified an application corresponding to the television 602 and represented by the icon 612. When the user selects the icon 612, the technology may launch the corresponding application (or install the corresponding application). In various embodiments, the technology may employ an antenna 614 associated with the mobile computing device 606, e.g., to communicate with the television 602 or a network computing device (not illustrated) to receive specifications relating to controlling the television 602. In various embodiments, the mobile computing device 606 may communicate with the television using infrared, radio frequency, WiFi, etc. FIG. 6B illustrates a user interface 620 displayed by the mobile computing device 606, e.g., when the user launches the application by selecting icon 612.
  • FIG. 7 is a flow diagram illustrating a routine 700 invoked by the disclosed technology in various embodiments. The routine 700 begins at block 702. The routine 700 then continues at block 704, where it receives a signal. In various embodiments, the routine 700 can receive a signal from a mobile computing device that a user is operating to command a device on which the routine 700 executes. The routine 700 then continues at block 706, where it provides methods operable to control the device. As an example, the routine 700 may provide a specification for controlling the device. The specification can include indications of user interfaces, available commands, frequencies, etc. The routine 700 then continues at block 708, where it receives a command. In various embodiments, the routine 700 may receive commands from the mobile computing device to which the routine 700 provided the specification. The routine 700 then continues at block 710, where it controls the device according to the received command. The routine 700 then returns at block 712.
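  • On the device side, routine 700 could be realized as a small service loop, sketched below; the `DESCRIBE` handshake, string command format, and callback signatures are assumptions that reuse the hypothetical specification sketched earlier.

```python
def device_endpoint(receive, transmit, spec, handlers):
    """Answer a discovery signal with the control specification, then execute
    commands as they arrive (mirroring blocks 704-712)."""
    if receive() == "DESCRIBE":       # block 704: signal from the mobile device
        transmit(spec)                # block 706: methods operable to control
    while True:
        command = receive()           # block 708: e.g. "VOL 30"
        if command is None:
            return                    # block 712: done
        verb, _, arg = command.partition(" ")
        handler = handlers.get(verb.lower())
        if handler:
            handler(arg)              # block 710: control the device accordingly
```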
  • FIG. 8 is a block diagram illustrating an example computing device 800 that is arranged in accordance with at least some embodiments of the present disclosure. In a very basic configuration 802, computing device 800 typically includes one or more processors 804 and a system memory 806. A memory bus 808 may be used for communicating between processor 804 and system memory 806.
  • Depending on the desired configuration, processor 804 may be of any type including but not limited to a microprocessor (“μP”), a microcontroller (“μC”), a digital signal processor (“DSP”), or any combination thereof. Processor 804 may include one or more levels of caching, such as a level one cache 810 and a level two cache 812, a processor core 814, and registers 816. An example processor core 814 may include an arithmetic logic unit (“ALU”), a floating point unit (“FPU”), a digital signal processing core (“DSP core”), or any combination thereof. An example memory controller 818 may also be used with processor 804, or in some implementations memory controller 818 may be an internal part of processor 804.
  • Depending on the desired configuration, system memory 806 may be of any type including but not limited to volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.) or any combination thereof. System memory 806 may include an operating system 820, one or more applications 822, and program data 824. Application 822 may include an application identifier component 826 that is arranged to identify applications corresponding to a recognized object. Program data 824 may include application attribute information 828, as is described herein. In some embodiments, application 822 may be arranged to operate with program data 824 on operating system 820 such that applications can be identified. This described basic configuration 802 is illustrated in FIG. 8 by those components within the inner dashed line.
  • Computing device 800 may have additional features or functionality, and additional interfaces to facilitate communications between basic configuration 802 and any required devices and interfaces. For example, a bus/interface controller 830 may be used to facilitate communications between basic configuration 802 and one or more data storage devices 832 via a storage interface bus 834. Data storage devices 832 may be removable storage devices 836, non-removable storage devices 838, or a combination thereof. Examples of removable storage and non-removable storage devices include magnetic disk devices such as flexible disk drives and hard-disk drives (“HDDs”), optical disk drives such as compact disk (“CD”) drives or digital versatile disk (“DVD”) drives, solid state drives (“SSDs”), and tape drives to name a few. Example computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • System memory 806, removable storage devices 836 and non-removable storage devices 838 are examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVDs) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by computing device 800. Any such computer storage media may be part of computing device 800.
  • Computing device 800 may also include an interface bus 840 for facilitating communication from various interface devices (e.g., output devices 842, peripheral interfaces 844, and communication devices 846) to basic configuration 802 via bus/interface controller 830. Example output devices 842 include a graphics processing unit 848 and an audio processing unit 850, which may be configured to communicate to various external devices such as a display or speakers via one or more A/V ports 852. Example peripheral interfaces 844 include a serial interface controller 854 or a parallel interface controller 856, which may be configured to communicate with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device, etc.) or other peripheral devices (e.g., printer, scanner, etc.) via one or more I/O ports 858. An example communication device 846 includes a network controller 860, which may be arranged to facilitate communications with one or more other computing devices 862 over a network communication link via one or more communication ports 864.
  • The network communication link may be one example of a communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. A “modulated data signal” may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (“RF”), microwave, infrared (“IR”) and other wireless media. The term computer readable media as used herein may include both storage media and communication media.
  • Computing device 800 may be implemented as a portion of a small-form factor portable (or mobile) electronic device such as a cell phone, a personal data assistant (“PDA”), a personal media player device, a wireless web-watch device, a personal headset device, an application specific device, or a hybrid device that includes any of the above functions. Computing device 800 may also be implemented as a personal computer including both laptop computer and non-laptop computer configurations.
  • Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. Accordingly, the invention is not limited except as by the appended claims.

Claims (25)

We claim:
1. A method performed by a processor, the method comprising:
detecting at least one object of multiple objects in a scene;
identifying, based on the detected at least one object, one or more applications yet to be installed at a mobile computation device; and
placing one or more icons proximate to the detected at least one object in a display of the scene, wherein the one or more icons represent the identified one or more applications.
2. The method of claim 1, wherein detecting the at least one object includes employing a digital camera.
3. The method of claim 1, wherein the scene is displayed on an output device.
4. The method of claim 1, further comprising:
receiving contextual information, wherein identifying the one or more applications includes identifying the one or more applications further based on the received contextual information.
5. The method of claim 4, wherein receiving the contextual information comprises receiving the contextual information via an input from a user.
6. The method of claim 4, wherein the received contextual information is based on positional information.
7. The method of claim 1, wherein detecting the at least one object includes employing one or more image recognition methods.
8. The method of claim 1, wherein identifying the one or more applications includes searching a list of attributes, associated with a plurality of applications, for an attribute associated with the detected at least one object.
9. The method of claim 1, further comprising:
receiving a user selection of an icon of the one or more icons; and
installing an application associated with the user selected icon at the mobile computation device.
10. The method of claim 1, further comprising:
receiving an input to disassociate an application, of the identified one or more applications, from the detected at least one object.
11. The method of claim 1, further comprising:
adapting the one or more icons based on the detected at least one object, prior to placing the one or more icons proximate to the detected at least one object in the display of the scene.
12. A computer-readable storage device that stores instructions that, in response to execution by a processor, cause the processor to perform or control performance of operations to:
detect at least one object of multiple objects in a scene;
identify, based on the detected at least one object, one or more applications yet to be installed at a mobile computation device; and
place one or more icons proximate to the detected at least one object in a display of the scene, wherein the one or more icons represent the identified one or more applications.
13. The computer-readable storage device of claim 12, wherein the operation to detect the at least one object includes an operation to employ a digital camera.
14. The computer-readable storage device of claim 12, wherein the scene is displayed on an output device.
15. The computer-readable storage device of claim 12, wherein the stored instructions, in response to execution by the processor, cause the processor to perform or control performance of at least one operation to:
obtain contextual information, wherein the identification of the one or more applications includes identification of the one or more applications further based on the obtained contextual information.
16. The computer-readable storage device of claim 15, wherein the contextual information is received via an input from a user.
17. The computer-readable storage device of claim 15, wherein the contextual information is based on positional information.
18. A system, comprising:
a first component configured to detect at least one object of multiple objects in a scene;
a second component operatively coupled to the first component, wherein the second component is configured to identify, based on the detected at least one object, one or more applications yet to be installed at a mobile computation device; and
a third component operatively coupled to the second component, wherein the third component is configured to place one or more icons proximate to the detected at least one object in a display of the scene, wherein the one or more icons represent the identified one or more applications.
19. The system of claim 18, wherein the first component is configured to detect the at least one object by use of a digital camera.
20. The system of claim 18, wherein the scene is displayed on an output device.
21. The system of claim 18, further comprising:
another component operatively coupled to the second component, wherein the another component is configured to receive contextual information, and wherein the second component is configured to identify the one or more applications further based on the contextual information received by the another component.
22. The system of claim 18, wherein the first component is configured to detect the at least one object by use of one or more image recognition methods.
23. The system of claim 18, wherein the second component is configured to identify the one or more applications by search of a list of attributes, associated with a plurality of applications, for an attribute associated with the detected at least one object.
24. The system of claim 18, wherein placement of the one or more icons proximate to the detected at least one object comprises the one or more icons being stacked proximate to the detected at least one object.
25. The system of claim 18, wherein the second component is configured to use features provided by the one or more applications to identify the one or more applications.
US15/633,636 2012-08-24 2017-06-26 Virtual reality applications Abandoned US20170308272A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/633,636 US20170308272A1 (en) 2012-08-24 2017-06-26 Virtual reality applications

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
PCT/US2012/052320 WO2014031126A1 (en) 2012-08-24 2012-08-24 Virtual reality applications
US201313821560A 2013-03-07 2013-03-07
US15/633,636 US20170308272A1 (en) 2012-08-24 2017-06-26 Virtual reality applications

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US13/821,560 Division US9690457B2 (en) 2012-08-24 2012-08-24 Virtual reality applications
PCT/US2012/052320 Division WO2014031126A1 (en) 2012-08-24 2012-08-24 Virtual reality applications

Publications (1)

Publication Number Publication Date
US20170308272A1 true US20170308272A1 (en) 2017-10-26

Family

ID=50149152

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/821,560 Expired - Fee Related US9690457B2 (en) 2012-08-24 2012-08-24 Virtual reality applications
US15/633,636 Abandoned US20170308272A1 (en) 2012-08-24 2017-06-26 Virtual reality applications

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/821,560 Expired - Fee Related US9690457B2 (en) 2012-08-24 2012-08-24 Virtual reality applications

Country Status (4)

Country Link
US (2) US9690457B2 (en)
JP (1) JP6289468B2 (en)
KR (2) KR101844395B1 (en)
WO (1) WO2014031126A1 (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9924102B2 (en) * 2013-03-14 2018-03-20 Qualcomm Incorporated Image-based application launcher
EP2990920B1 (en) * 2013-04-22 2019-04-03 Fujitsu Limited Information terminal control method
KR102178892B1 (en) * 2014-09-15 2020-11-13 삼성전자주식회사 Method for providing an information on the electronic device and electronic device thereof
US10181219B1 (en) 2015-01-21 2019-01-15 Google Llc Phone control and presence in virtual reality
US10102674B2 (en) 2015-03-09 2018-10-16 Google Llc Virtual reality headset connected to a mobile computing device
KR102063895B1 (en) 2015-04-20 2020-01-08 삼성전자주식회사 Master device, slave device and control method thereof
US20170242675A1 (en) * 2016-01-15 2017-08-24 Rakesh Deshmukh System and method for recommendation and smart installation of applications on a computing device
WO2018118657A1 (en) 2016-12-21 2018-06-28 Pcms Holdings, Inc. Systems and methods for selecting spheres of relevance for presenting augmented reality information
US10909371B2 (en) 2017-01-19 2021-02-02 Samsung Electronics Co., Ltd. System and method for contextual driven intelligence
WO2018135881A1 (en) 2017-01-19 2018-07-26 Samsung Electronics Co., Ltd. Vision intelligence management for electronic devices
US10867181B2 (en) * 2017-02-20 2020-12-15 Pcms Holdings, Inc. Dynamically presenting augmented reality information for reducing peak cognitive demand
KR101932008B1 (en) * 2017-12-29 2018-12-24 (주)제이엘케이인스펙션 Image analysis apparatus and method based on feature and context of image
US11200402B2 (en) * 2018-01-26 2021-12-14 GICSOFT, Inc. Application execution based on object recognition
US11157130B2 (en) * 2018-02-26 2021-10-26 Adobe Inc. Cursor-based resizing for copied image portions
US10878037B2 (en) 2018-06-21 2020-12-29 Google Llc Digital supplement association and retrieval for visual search
US10579230B2 (en) 2018-06-21 2020-03-03 Google Llc Digital supplement association and retrieval for visual search
CN113272766A (en) * 2019-01-24 2021-08-17 麦克赛尔株式会社 Display terminal, application control system, and application control method
CN109814800A (en) * 2019-01-25 2019-05-28 努比亚技术有限公司 Footmark sweep-out method, mobile terminal and computer readable storage medium
US11089109B1 (en) * 2019-11-20 2021-08-10 Sprint Communications Company L.P. Smart device management via a mobile communication device based on privacy preferences
US20230177188A1 (en) * 2021-12-06 2023-06-08 Sap Se Transitioning from an integrated end-of-purpose protocol to an aligned purpose disassociation protocol

Family Cites Families (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1215601A (en) 1915-12-16 1917-02-13 Cora Belle Williams Corset.
US6037936A (en) * 1993-09-10 2000-03-14 Criticom Corp. Computer vision system with a graphic user interface and remote camera control
US5835094A (en) 1996-12-31 1998-11-10 Compaq Computer Corporation Three-dimensional computer environment
US6252597B1 (en) 1997-02-14 2001-06-26 Netscape Communications Corporation Scalable user interface for graphically representing hierarchical data
US7073129B1 (en) * 1998-12-18 2006-07-04 Tangis Corporation Automated selection of appropriate information based on a computer user's context
US8341553B2 (en) * 2000-02-17 2012-12-25 George William Reed Selection interface systems and methods
US7076503B2 (en) 2001-03-09 2006-07-11 Microsoft Corporation Managing media objects in a database
JP3871904B2 (en) 2001-06-07 2007-01-24 日立ソフトウエアエンジニアリング株式会社 How to display a dendrogram
US7653212B2 (en) 2006-05-19 2010-01-26 Universal Electronics Inc. System and method for using image data in connection with configuring a universal controlling device
WO2004003705A2 (en) * 2002-06-27 2004-01-08 Small World Productions, Inc. System and method for locating and notifying a user of a person, place or thing having attributes matching the user's stated preferences
US20050039133A1 (en) * 2003-08-11 2005-02-17 Trevor Wells Controlling a presentation of digital content
US7313574B2 (en) 2003-10-02 2007-12-25 Nokia Corporation Method for clustering and querying media items
US7565139B2 (en) * 2004-02-20 2009-07-21 Google Inc. Image-based search engine for mobile phones with camera
US20050289158A1 (en) 2004-06-25 2005-12-29 Jochen Weiss Identifier attributes for product data stored in an electronic database
EP1810182A4 (en) 2004-08-31 2010-07-07 Kumar Gopalakrishnan Method and system for providing information services relevant to visual imagery
US8370769B2 (en) * 2005-06-10 2013-02-05 T-Mobile Usa, Inc. Variable path management of user contacts
US7728869B2 (en) 2005-06-14 2010-06-01 Lg Electronics Inc. Matching camera-photographed image with map data in portable terminal and travel route guidance method
US20070050468A1 (en) 2005-08-09 2007-03-01 Comverse, Ltd. Reality context menu (RCM)
US7836065B2 (en) * 2005-11-01 2010-11-16 Sap Ag Searching multiple repositories in a digital information system
US7725077B2 (en) 2006-03-24 2010-05-25 The Invention Science Fund 1, Llc Wireless device with an aggregate user interface for controlling other devices
US7913192B2 (en) * 2006-03-31 2011-03-22 Research In Motion Limited Methods and apparatus for retrieving and displaying map-related data for visually displayed maps of mobile communication devices
US7752207B2 (en) * 2007-05-01 2010-07-06 Oracle International Corporation Crawlable applications
US8290513B2 (en) * 2007-06-28 2012-10-16 Apple Inc. Location-based services
US8281240B2 (en) 2007-08-23 2012-10-02 International Business Machines Corporation Avatar aggregation in a virtual universe
US8180396B2 (en) 2007-10-18 2012-05-15 Yahoo! Inc. User augmented reality for camera-enabled mobile devices
US8264505B2 (en) 2007-12-28 2012-09-11 Microsoft Corporation Augmented reality and filtering
US8762285B2 (en) 2008-01-06 2014-06-24 Yahoo! Inc. System and method for message clustering
US9503562B2 (en) * 2008-03-19 2016-11-22 Universal Electronics Inc. System and method for appliance control via a personal communication or entertainment device
US20090262084A1 (en) * 2008-04-18 2009-10-22 Shuttle Inc. Display control system providing synchronous video information
US9870130B2 (en) * 2008-05-13 2018-01-16 Apple Inc. Pushing a user interface to a remote device
US9311115B2 (en) * 2008-05-13 2016-04-12 Apple Inc. Pushing a graphical user interface to a remote device with display rules provided by the remote device
US8711176B2 (en) * 2008-05-22 2014-04-29 Yahoo! Inc. Virtual billboards
US20090322671A1 (en) 2008-06-04 2009-12-31 Cybernet Systems Corporation Touch screen augmented reality system and method
US8260320B2 (en) 2008-11-13 2012-09-04 Apple Inc. Location specific content
US9342231B2 (en) * 2008-12-29 2016-05-17 Apple Inc. Remote control of a presentation
US7870496B1 (en) * 2009-01-29 2011-01-11 Jahanzeb Ahmed Sherwani System using touchscreen user interface of a mobile device to remotely control a host computer
US20130124311A1 (en) * 2009-03-23 2013-05-16 Sujai Sivanandan System and Method for Dynamic Integration of Advertisements in a Virtual Environment
JP4829381B2 (en) * 2009-05-18 2011-12-07 隆敏 柳瀬 Knowledge base system, logical operation method, program, and recording medium
WO2011000046A1 (en) 2009-07-01 2011-01-06 Ozmota Inc. Systems and methods for determining information and knowledge relevancy, relevant knowledge discovery and interactions, and knowledge creation
KR101595762B1 (en) * 2009-11-10 2016-02-22 삼성전자주식회사 Method for controlling remote of portable terminal and system for the same
US8850342B2 (en) 2009-12-02 2014-09-30 International Business Machines Corporation Splitting avatars in a virtual world
KR101657565B1 (en) * 2010-04-21 2016-09-19 엘지전자 주식회사 Augmented Remote Controller and Method of Operating the Same
KR20110118421A (en) * 2010-04-23 2011-10-31 엘지전자 주식회사 Augmented remote controller, augmented remote controller controlling method and the system for the same
US20110161875A1 (en) 2009-12-29 2011-06-30 Nokia Corporation Method and apparatus for decluttering a mapping display
US8725706B2 (en) * 2010-03-26 2014-05-13 Nokia Corporation Method and apparatus for multi-item searching
US8990702B2 (en) * 2010-09-30 2015-03-24 Yahoo! Inc. System and method for controlling a networked display
US9021354B2 (en) * 2010-04-09 2015-04-28 Apple Inc. Context sensitive remote device
US9361729B2 (en) 2010-06-17 2016-06-07 Microsoft Technology Licensing, Llc Techniques to present location information for social networks using augmented reality
US20110316845A1 (en) * 2010-06-25 2011-12-29 Palo Alto Research Center Incorporated Spatial association between virtual and augmented reality
KR101329882B1 (en) 2010-08-12 2013-11-15 주식회사 팬택 Apparatus and Method for Displaying Augmented Reality Window
US20120075433A1 (en) 2010-09-07 2012-03-29 Qualcomm Incorporated Efficient information presentation for augmented reality
US9710554B2 (en) 2010-09-23 2017-07-18 Nokia Technologies Oy Methods, apparatuses and computer program products for grouping content in augmented reality
US20120212405A1 (en) * 2010-10-07 2012-08-23 Benjamin Zeis Newhouse System and method for presenting virtual and augmented reality scenes to a user
JP5257437B2 (en) 2010-10-20 2013-08-07 コニカミノルタビジネステクノロジーズ株式会社 Method for operating portable terminal and processing device
KR101357260B1 (en) 2010-10-22 2014-02-03 주식회사 팬택 Apparatus and Method for Providing Augmented Reality User Interface
US8698843B2 (en) * 2010-11-02 2014-04-15 Google Inc. Range of focus in an augmented reality application
US8952983B2 (en) 2010-11-04 2015-02-10 Nokia Corporation Method and apparatus for annotating point of interest information
US20120113223A1 (en) 2010-11-05 2012-05-10 Microsoft Corporation User Interaction in Augmented Reality
US8913085B2 (en) 2010-12-22 2014-12-16 Intel Corporation Object mapping techniques for mobile augmented reality applications
US20120173979A1 (en) * 2010-12-31 2012-07-05 Openpeak Inc. Remote control system and method with enhanced user interface
US20120182205A1 (en) * 2011-01-18 2012-07-19 Schlumberger Technology Corporation Context driven heads-up display for efficient window interaction
US8929591B2 (en) * 2011-03-08 2015-01-06 Bank Of America Corporation Providing information associated with an identified representation of an object
KR101829063B1 (en) 2011-04-29 2018-02-14 삼성전자주식회사 Method for displaying marker in map service
US9727132B2 (en) * 2011-07-01 2017-08-08 Microsoft Technology Licensing, Llc Multi-visor: managing applications in augmented reality environments
TWI452527B (en) * 2011-07-06 2014-09-11 Univ Nat Chiao Tung Method and system for application program execution based on augmented reality and cloud computing
US9058331B2 (en) 2011-07-27 2015-06-16 Ricoh Co., Ltd. Generating a conversation in a social network based on visual search results
JP5987299B2 (en) * 2011-11-16 2016-09-07 ソニー株式会社 Display control apparatus, display control method, and program
US20130155108A1 (en) * 2011-12-15 2013-06-20 Mitchell Williams Augmented Reality User Interaction Methods, Computing Devices, And Articles Of Manufacture
WO2014035367A1 (en) 2012-08-27 2014-03-06 Empire Technology Development Llc Generating augmented reality exemplars
US10001918B2 (en) * 2012-11-21 2018-06-19 Algotec Systems Ltd. Method and system for providing a specialized computer input device
US10025486B2 (en) * 2013-03-15 2018-07-17 Elwha Llc Cross-reality select, drag, and drop for augmented reality systems

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5874966A (en) * 1995-10-30 1999-02-23 International Business Machines Corporation Customizable graphical user interface that automatically identifies major objects in a user-selected digitized color image and permits data to be associated with the major objects
US20120042036A1 (en) * 2010-08-10 2012-02-16 Microsoft Corporation Location and contextual-based mobile application promotion and delivery
US20130051615A1 (en) * 2011-08-24 2013-02-28 Pantech Co., Ltd. Apparatus and method for providing applications along with augmented reality data
US20130201215A1 (en) * 2012-02-03 2013-08-08 John A. MARTELLARO Accessing applications in a mobile augmented reality environment

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11769307B2 (en) 2015-10-30 2023-09-26 Snap Inc. Image based tracking in augmented reality systems
US11335067B2 (en) 2017-09-15 2022-05-17 Snap Inc. Augmented reality system
US11721080B2 (en) 2017-09-15 2023-08-08 Snap Inc. Augmented reality system
WO2019147021A1 (en) * 2018-01-23 2019-08-01 Samsung Electronics Co., Ltd. Device for providing augmented reality service, and method of operating the same
WO2020047259A1 (en) * 2018-08-31 2020-03-05 Snap Inc. Augmented reality anthropomorphization system
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11450050B2 (en) 2018-08-31 2022-09-20 Snap Inc. Augmented reality anthropomorphization system
US11676319B2 2018-08-31 2023-06-13 Snap Inc. Augmented reality anthropomorphization system
WO2020114756A1 (en) 2018-12-03 2020-06-11 Signify Holding B.V. Determining a control mechanism based on a surrounding of a remote controllable device
US11475664B2 2018-12-03 2022-10-18 Signify Holding B.V. Determining a control mechanism based on a surrounding of a remote controllable device
US11972529B2 (en) 2019-02-01 2024-04-30 Snap Inc. Augmented reality system

Also Published As

Publication number Publication date
KR20180035243A (en) 2018-04-05
US20140059458A1 (en) 2014-02-27
KR101844395B1 (en) 2018-04-02
KR20150046258A (en) 2015-04-29
WO2014031126A1 (en) 2014-02-27
US9690457B2 (en) 2017-06-27
JP6289468B2 (en) 2018-03-07
JP2015531135A (en) 2015-10-29

Similar Documents

Publication Publication Date Title
US20170308272A1 (en) Virtual reality applications
US9904906B2 (en) Mobile terminal and data provision method thereof
US8806373B2 (en) Display control apparatus, display control method, display control program, and recording medium storing the display control program
KR101753031B1 (en) Mobile terminal and Method for setting metadata thereof
EP2432221B1 (en) Mobile terminal, electronic system and method of transmitting and receiving data using the same
KR102027899B1 (en) Method and apparatus for providing information using messenger
US10387510B2 (en) Content search method and electronic device implementing same
JP6487149B2 (en) Application providing method for portable terminal, electronic apparatus therefor, and computer-readable storage medium storing a program for executing the same
KR102178892B1 (en) Method for providing an information on the electronic device and electronic device thereof
CN110168487B (en) Touch control method and device
US20140240551A1 (en) Apparatus and method for synthesizing an image in a portable terminal equipped with a dual camera
CN108781235A (en) A kind of display methods and device
US10990748B2 (en) Electronic device and operation method for providing cover of note in electronic device
CN110865756B (en) Image labeling method, device, equipment and storage medium
US20150205468A1 (en) Method and apparatus for controlling user interface
US20160018984A1 (en) Method of activating user interface and electronic device supporting the same
US20150177957A1 (en) Method and apparatus for processing object provided through display
US9430806B2 (en) Electronic device and method of operating the same
US9792183B2 (en) Method, apparatus, and recording medium for interworking with external terminal
US9429447B2 (en) Method of utilizing image based on location information of the image in electronic device and the electronic device thereof
JP6356869B2 (en) Virtual reality application
US11237706B2 (en) Information processing method and terminal
US20180074697A1 (en) Method for outputting screen according to force input and electronic device supporting the same
KR20180027886A (en) Electronic device and operating method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: CRESTLINE DIRECT FINANCE, L.P., TEXAS

Free format text: SECURITY INTEREST;ASSIGNOR:EMPIRE TECHNOLOGY DEVELOPMENT LLC;REEL/FRAME:048373/0217

Effective date: 20181228

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION