US20160321840A1 - Systems, methods, and computer program products for navigating through a virtual/augmented reality - Google Patents

Info

Publication number
US20160321840A1
US20160321840A1 (application US15/211,721; US201615211721A)
Authority
US
United States
Prior art keywords
mobile device
mode display
user
response
street view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/211,721
Inventor
Chad Anthony Geraci
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
eBay Inc
Original Assignee
eBay Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by eBay Inc filed Critical eBay Inc
Priority to US15/211,721 priority Critical patent/US20160321840A1/en
Publication of US20160321840A1 publication Critical patent/US20160321840A1/en
Assigned to EBAY INC. Assignment of assignors interest (see document for details). Assignors: GERACI, CHAD ANTHONY
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3605Destination input or retrieval
    • G01C21/362Destination input or retrieval received from an external device or application, e.g. PDA, mobile phone or calendar application
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/248Presentation of query results
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9537Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G06F17/30554
    • G06F17/3087
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0259Targeted advertisements based on store location
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0261Targeted advertisements based on user location
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Definitions

  • the present disclosure generally relates to navigating through a virtual or augmented reality and, more particularly, to utilizing a mobile device to navigate through a virtual or augmented reality street view that has search results layered over the display.
  • Online transactions are becoming more and more prevalent, with an ever-increasing number of online entities that may or may not have a physical real world counterpart.
  • the popularity of online transactions is partially attributable to the ease and convenience of making a transaction online instead of at a physical commercial location.
  • mobile electronic communication devices and applications have been developed that are specifically aimed at commerce as well.
  • these mobile devices may include smart phones, computer tablets, or laptops.
  • Many of these mobile devices have increased capabilities, as they have been equipped with, among other hardware devices, a Global Positioning System (GPS) receiver, a compass, and a digital camera.
  • FIG. 1 is an illustration showing a user at a physical location receiving content from a content provider, according to various aspects of the present disclosure.
  • FIG. 2 is an illustration of a mobile device, according to various aspects of the present disclosure.
  • FIG. 3A is an illustration of a side view of a mobile device, according to various aspects of the present disclosure.
  • FIG. 3B is an illustration of a top view of a mobile device, according to various aspects of the present disclosure.
  • FIG. 4A is an illustration of an example method, according to various aspects of the present disclosure.
  • FIG. 4B is an illustration of an example method, which may be performed by an application on a mobile device, according to various aspects of the present disclosure.
  • FIG. 5 is a block diagram of an example computer system suitable for implementing various methods and devices described, according to various aspects of the present disclosure.
  • the system includes a display interface operable to receive an input from a user and communicate an output to the user, a transceiver operable to electronically communicate with a content provider, a computer processor operable to execute instructions, and a memory storage operable to store the instructions.
  • the memory storage further comprising a program module that is operable to: submit a search request regarding a specific item at the content provider; receive search data regarding the specific item from the content provider; display a virtual reality street view on the display interface, the virtual reality street view having the search data regarding the specific item overlaid thereon; and navigate through the virtual reality street view by performing navigation functions.
  • the method includes interacting with a provider of digital content to request data regarding a specific item of interest using a mobile device associated with a user, receiving the requested data regarding the specific item of interest from the provider of digital content on the mobile device, rendering the received data regarding the specific item of interest on a display of the mobile device, the rendering including superimposing the received data on a street view, and navigating through the street view according to navigation functions of the mobile device to locate a physical commercial location that carries the specific item of interest.
  • the electronic device includes a non-transitory, tangible computer readable storage medium storing a computer program.
  • the computer program contains instructions that when executed perform: receiving data representative of a specific item, the data representative of a specific item received at the mobile device, which is associated with a user; based on the received data representative of the specific item, submitting a search request at a content provider to determine a physical commercial location that carries the specific item; receiving search data regarding the specific item from the content provider, the received search data regarding the specific item including data regarding a physical commercial location that carries the specific item; overlaying the search data regarding the specific item on a street view on the display of the mobile device; and navigating through the street view according to navigation functions of the mobile device.
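  • The claimed flow (receive item data, query the content provider, overlay the results on a street view) can be sketched as follows. This is an illustrative sketch only: the request/result shapes and function names are assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of the claimed search-and-overlay flow; the data
# shapes and names are illustrative assumptions, not the patent's code.

def build_search_request(item_code: str, user_location: tuple) -> dict:
    """Package the identified item and the device location for the provider."""
    lat, lon = user_location
    return {"item": item_code, "lat": lat, "lon": lon}

def overlay_results(street_view_frame: dict, results: list) -> dict:
    """Superimpose each retailer result on the current street-view frame."""
    frame = dict(street_view_frame)
    frame["overlays"] = [
        {"label": f"{r['retailer']} ({r['in_stock']} in stock)",
         "position": r["position"]}
        for r in results
    ]
    return frame

# Example: one retailer carrying the item, placed on a street-view frame.
results = [{"retailer": "Maud's Fashion", "in_stock": 3, "position": (120, 80)}]
frame = overlay_results({"image_id": "sv_001"}, results)
```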
  • one embodiment includes methods whereby a user of a mobile device uses the mobile device to search for a specific item. After the specific item has been found, the mobile device renders a map and/or a street view of a virtual or an augmented reality with an overlay of the physical location of the retailers that have that specific item. The user, thereafter, may use the mobile device to navigate through the street view of the virtual or the augmented reality to identify a desirable brick-and-mortar retailer that has that specific item.
  • Another instance includes embodiments such as provider-side and user-side applications and devices to provide and receive, respectively, digital data including specific item data, map and/or street view data, and overlay data, which the mobile device can merge to thereby allow the user to navigate and identify a desirable brick-and-mortar retailer that has that specific item.
  • various embodiments combine the digital realm with the physical realm.
  • a user seeking to participate in a brick-and-mortar channel of commerce receives digital content on a mobile device and, thus, also participates in an on-line, mobile channel of commerce.
  • FIG. 1 represents an illustration of a system 100 according to various aspects of the present disclosure.
  • a user 101 is at physical location 102 and has mobile device 104 .
  • the mobile device 104 has access to a network 106 which is used to communicate with one or more on-line content providers 108 .
  • the on-line content providers 108 provide digital data to the mobile device 104 of the user 101 via network 106 , according to embodiments described further below.
  • Physical location 102 may be a static physical location (i.e., the user 101 /mobile device 104 are not moving) or a dynamic physical location (i.e., the user 101 /mobile device 104 are moving).
  • physical location 102 is initially a static physical location such as a coffee house, a hotel lobby, etc.
  • the physical location may change dynamically after a period of time.
  • the mobile device 104 has a network connection at the physical location 102 , according to various network connection methods discussed below.
  • Mobile device 104 may include any type of mobile device configured to access digital data over a network. Examples include a notebook/laptop computer, a tablet computer (such as an iPad™ tablet computer), an MP3 player (such as an iPod™ music player), an e-book reader (such as the Kindle™ reader), a smartphone (such as the iPhone™ phone), and/or the like. Mobile device 104 includes at least one network connection operable to communicate with at least one content provider 108 over network 106. Examples of network connections include 3G/4G/LTE cellular wireless connections, 802.11 (Wi-Fi) connections to a LAN, WAN, or the Internet, a wired connection (such as by Ethernet), and/or the like.
  • mobile device 104 may be equipped with cameras, Global Positioning System (GPS) transceivers, and various kinds of sensors such as accelerometers, proximity sensors, ambient light sensors, compasses, gyroscopes, etc.
  • the mobile device 104 may further include programs and/or applications stored on computer readable medium that utilizes the above disclosed equipment for input/output (I/O) of data and to determine the physical location 102 of the user 101 and the mobile device 104 .
  • User 101 may access content provided by on-line content provider 108 by, e.g., accessing content of content provider 108 through a web browser or a specialized application on mobile device 104 .
  • user 101 may be seeking to purchase a specific item at a brick-and-mortar retailer nearby.
  • the user 101 may gain access to the network 106 and thereafter direct a web browser of the mobile device 104 to a content provider 108 website.
  • the content provider 108 may be any content provider, such as Google, Yahoo, MSN, MILO, etc.
  • the user 101 submits a search request for the specific item sought.
  • the content provider 108 provides search result data regarding the specific item to the mobile device 104 via network 106 .
  • the user 101 utilizes the application executing on the mobile device 104 to submit the search request to the content provider 108 , which in response provides search result data regarding the specific item to the user's 101 mobile device 104 via network 106 .
  • the user 101 may also utilize other means of searching for the specific item.
  • other means of searching for the specific item include scanning a code representative of the item searched by executing a program or an application that uses the camera of the mobile device 104 .
  • Codes representative of the item searched may include a UPC code, an EAN code, a QR code, or any other appropriate code.
  • the scanning may be performed using the mobile device 104 executing code scanning programs and/or applications such as RedLaser®, Code Scanner®, or any appropriate code scanning programs and/or applications.
  • Such applications may be downloaded from both the Apple® App Store and the Google® Android Market.
  • the program and/or application contacts the content provider 108 over the network 106 and submits a search request for the specific item.
  • the content provider 108 returns search results data regarding the specific item to the mobile device 104 of the user via network 106 .
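  • As a concrete illustration of the code-scan path, the sketch below validates a scanned UPC-A code with its standard modulo-10 check digit before submitting it as a search request. The check-digit rule is standard; the request payload shape is a hypothetical stand-in.

```python
# Sketch of the scan-to-search path; the UPC-A check-digit rule is standard,
# but the search-request shape is an assumption.

def upc_check_digit_ok(upc: str) -> bool:
    """Validate a 12-digit UPC-A code against its modulo-10 check digit."""
    if len(upc) != 12 or not upc.isdigit():
        return False
    digits = [int(c) for c in upc]
    # Odd positions (1st, 3rd, ...) are weighted 3; even positions weighted 1.
    total = sum(d * 3 for d in digits[0:11:2]) + sum(digits[1:11:2])
    return (10 - total % 10) % 10 == digits[11]

def scan_to_search(upc: str) -> dict:
    """Turn a scanned, validated code into a search request payload."""
    if not upc_check_digit_ok(upc):
        raise ValueError("invalid UPC")
    return {"type": "upc", "code": upc}
```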
  • other means of searching for the specific item include visual identification of the specific item using spatial or pattern recognition.
  • the mobile device 104 may execute a spatial or pattern recognition program and/or application, which is stored on computer readable medium of the mobile device 104 .
  • the spatial or pattern recognition program and/or application may utilize the camera of the mobile device 104 to obtain a digital image of the specific item desired by the user 101 .
  • the program and/or application performs spatial or pattern recognition to identify the specific item and contacts the content provider 108 over the network 106 and submits a search request for the specific item.
  • the content provider 108 returns search results data regarding the specific item to the mobile device 104 of the user via network 106 .
  • the program and/or application submits that digital image to the content provider 108 , which performs spatial or pattern recognition to identify the specific item and thereafter returns search results data regarding the specific item to the mobile device 104 of the user via network 106 .
  • content provider 108 downloads the search results data to the mobile device 104
  • content provider 108 streams the search results data to the mobile device 104
  • the search results data may include data regarding the location/address of brick-and-mortar retailers that have the specific item in inventory, contact information of retailers, number of items in inventory, size in inventory, price of item, product names, ratings of the product, etc.
  • the location of the brick-and-mortar retailer, provided in the search results may be based on the location of the user 101 and the mobile device 104 .
  • the content provider 108 may limit the search results to brick-and-mortar retailers near the user 101 /mobile device 104 .
  • the various embodiments may use any such technique now known or later developed.
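  • One way the provider might limit results to retailers near the user is a straight distance filter, as in the sketch below; the haversine formula is standard, while the result schema and the 5 km radius are illustrative assumptions.

```python
# Sketch of limiting search results to nearby brick-and-mortar retailers,
# assuming the provider returns latitude/longitude pairs per retailer.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

def nearby(results, user_lat, user_lon, radius_km=5.0):
    """Keep only retailers within radius_km of the user's position."""
    return [r for r in results
            if haversine_km(user_lat, user_lon, r["lat"], r["lon"]) <= radius_km]

stores = [{"name": "A", "lat": 37.78, "lon": -122.42},   # ~0.6 km away
          {"name": "B", "lat": 34.05, "lon": -118.24}]   # hundreds of km away
near = nearby(stores, 37.7749, -122.4194)
```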
  • the user 101 may utilize the provided search results to navigate through a virtual/augmented reality street view in order to locate a nearby brick-and-mortar retailer that has the specific item in inventory, as will be readily apparent from the disclosure set forth below.
  • a program and/or application on mobile device 104 and/or a program running on a computer at physical location 102 or content provider 108 verifies the location of user 101 .
  • mobile device 104 is enabled for the Global Positioning System (GPS) or another satellite-based location service
  • a GPS receiver built into mobile device 104 discerns the location of the mobile device 104 .
  • a program and/or application on mobile device 104 and/or a program at content provider 108 analyzes location information received from the GPS receiver and makes a determination of the location of the user 101 /mobile device 104 to provide location based search results.
  • mobile device 104 communicates with cell towers nearby (for example through a cellular band or mode).
  • the cell towers can be used to triangulate the position based on communication with the mobile device 104 of the user 101 .
  • the content provider 108 may ascertain the physical location and provide location based search results.
  • mobile device 104 is configured to connect to a network at the physical location 102 , so that mobile device 104 is assigned an Internet Protocol (IP) address.
  • the IP address may be received and analyzed by an application on mobile device 104 and/or by the content provider 108 .
  • content provider 108 may ascertain the physical location of the user 101 and the mobile device 104 and provide location based search results.
  • user 101 enters a physical location into the mobile device 104, which is used by the content provider 108 to provide location based search results. Any technique now known or later developed to discern the physical location of the user 101 and the mobile device 104 may be adapted for use in various embodiments.
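  • The location sources above (GPS, cell-tower triangulation, IP address, manual entry) suggest a simple fallback chain. The sketch below is an assumed ordering by typical accuracy, with each source represented as an optional (lat, lon) fix; it is not the patent's implementation.

```python
# Hypothetical fallback chain for discerning the user's physical location.
# Each argument is an optional (lat, lon) fix; None means the source is
# unavailable (e.g., GPS has no satellite lock indoors).

def resolve_location(gps=None, cell=None, ip_geo=None, manual=None):
    """Return (lat, lon, source) from the most accurate available source."""
    for source, fix in (("gps", gps), ("cell", cell),
                        ("ip", ip_geo), ("manual", manual)):
        if fix is not None:
            return (*fix, source)
    raise LookupError("no location source available")
```

For example, indoors with no GPS lock but a cell-tower fix, the chain falls through to triangulation.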
  • Upon receiving the search result data regarding the specific item, in one embodiment, the user 101 utilizes a mapping program and/or application to locate nearby brick-and-mortar retailers that have the specific item in inventory.
  • the search result data may be rendered in a map view that shows a topical area such as streets, the user 101 location, and the location of the brick-and-mortar retailers that have the specific item in stock.
  • the mapping program and/or application may be stored in whole or in part on computer readable medium of the mobile device 104 of the user 101 , and/or may be downloaded or streamed to the mobile device 104 from the content provider 108 .
  • the content provider 108 that has provided the search results data may be the same or different from the content provider that provides the mapping data.
  • the content provider that provides the search results data may be MILO and the content provider that provides the mapping data may be Google.
  • Google may provide both the search results data and the mapping data to the mobile device 104 of the user 101 .
  • the user may decide to switch to a street view 200 and navigate through a virtual reality that displays images of the street at or near the user's 101 current physical location 102 .
  • navigating through the virtual reality street view may be performed in a static and/or in a dynamic mode.
  • the street view includes an overlay of brick-and-mortar stores 202 that have the specific item in stock (e.g., Maud's Fashion, Jackie's Fashion, etc.). Additional information, which was provided in the search results, may be presented to the user while in street view.
  • the number of items in stock (e.g., Viesal Jacket (3) in stock) is displayed.
  • the user may select a button 204 related to the brick-and-mortar retailer 202 to view additional information related to the specific item and/or the retailer 202 .
  • Additional information may be information such as retailer contact information (e.g., phone number/website/email), size in inventory, price of item, product names, ratings of the product, etc.
  • a portion of the street view display may be configured to provide user instructions (e.g., Tilt phone to navigate, etc.).
  • FIG. 3A illustrates a schematic side view of the mobile device 104 according to various aspects of the present disclosure
  • FIG. 3B illustrates a schematic top view of the mobile device 104 according to various aspects of the present disclosure.
  • the user may navigate through the street view longitudinally according to various methods. For instance, the user may navigate forward through the street view by tilting the mobile device 104 in a forward motion 302 to angle θ1, about the x-axis. Further, the user may navigate backward through the street view by tilting the mobile device 104 in a backward motion 304 to angle θ2, about the x-axis.
  • by tilting the mobile device 104 to a greater angle from a fixed angle position (e.g., a zero angle position) about the x-axis, navigating through the street view will occur at a faster pace; by tilting the mobile device 104 to a lesser angle from the fixed angle position, navigating through the street view will occur at a slower pace.
  • there is a hysteresis where a small angle change from the fixed angle position does not affect navigation.
  • the speed at which the mobile device 104 has been tilted about the x-axis affects the pace of navigation through the street view.
  • the duration that the mobile device 104 is tilted about the x-axis affects the pace of navigation through the street view.
  • the rate or pace of forward/backward movement through the street view may be a function of the angle of tilt, speed of tilt, duration of tilt, or a combination of these factors.
  • the tilt angle may range from about a positive 90 degrees (for forward movement) to about a negative 90 degrees (for backward movement). In the illustrated embodiment, the angle ranges from about a positive 45 degrees (for forward movement θ1) to about a negative 45 degrees (for backward movement θ2).
  • the tilt angle, tilt speed, and tilt duration may be determined by utilizing gyroscope and clock hardware of the mobile device 104 and various programs and/or applications installed on computer readable medium and executed on the mobile device 104 .
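  • A minimal sketch of the tilt-to-pace mapping: the ±45° range and the small dead band (hysteresis) come from the description above, while the linear mapping, the maximum speed, and the specific dead-band width are illustrative choices.

```python
# Hedged sketch of mapping an x-axis tilt angle to navigation pace.
# Positive tilt moves forward, negative backward; greater angle, faster pace;
# a small dead band near zero implements the hysteresis described above.

MAX_TILT_DEG = 45.0    # ±45° range in the illustrated embodiment
DEAD_BAND_DEG = 5.0    # assumed dead-band width (hysteresis)
MAX_SPEED = 1.0        # normalized pace at full tilt (assumed)

def tilt_to_pace(tilt_deg: float) -> float:
    """Map a tilt angle (degrees) to a signed, normalized navigation pace."""
    tilt_deg = max(-MAX_TILT_DEG, min(MAX_TILT_DEG, tilt_deg))  # clamp range
    if abs(tilt_deg) < DEAD_BAND_DEG:
        return 0.0  # small angle changes do not affect navigation
    span = MAX_TILT_DEG - DEAD_BAND_DEG
    sign = 1.0 if tilt_deg > 0 else -1.0
    return sign * MAX_SPEED * (abs(tilt_deg) - DEAD_BAND_DEG) / span
```

In a full implementation, tilt speed and duration (read from the gyroscope and clock) could scale this pace further, per the combination of factors described above.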
  • the user may navigate through the virtual reality angularly according to various methods. For instance, the user may turn right through the street view by rotating the mobile device 104 toward the right 306 to angle θ3, about the y-axis. Further, the user may turn left through the street view by rotating the mobile device 104 toward the left 308 to angle θ4, about the y-axis. In further embodiments, by rotating the mobile device 104 to a greater rotation angle from the fixed angle position (e.g., zero angle position) about the y-axis, turning through the street view will occur at a faster pace.
  • by rotating the mobile device 104 by a lesser rotation angle from the fixed angle position about the y-axis, turning through the street view will occur at a slower pace. In certain examples, there is a hysteresis where a small angle change from the fixed angle position does not affect navigation. In further examples, the speed at which the mobile device 104 has been rotated affects the rate of turning through the street view. In still further examples, the duration that the mobile device 104 is rotated affects the rate of turning through the street view.
  • the rate of turning through the street view may be a function of the angle of rotation, speed of rotation, duration of rotation, or a combination of these factors.
  • the rotation angle may range from about a positive 90 degrees (for right turns) to about a negative 90 degrees (for left turns). In the illustrated embodiment, the angle ranges from about a positive 45 degrees (for right turns θ3) to about a negative 45 degrees (for left turns θ4).
  • the rotation angle, rotation speed, and rotation duration may be determined by utilizing compass and clock hardware of the mobile device 104 and various programs and/or applications installed on computer readable medium and executed on the mobile device 104 .
  • buttons, which are illustrated as arrow keys 206, are provided on the display and are activated by a user, e.g., by pressing, in order to navigate through the virtual reality street view.
  • the arrow keys 206 point forward, backward, left, and right for navigation through the street view.
  • buttons that have another form (e.g., triangular, round, oval, square, rectangular, etc.) are located at substantially similar locations and perform substantially similar functions as the arrow keys.
  • the user may navigate through the virtual reality street view by utilizing sliding motions on the display.
  • the mobile device 104 may be configured such that a vertical sliding motion (e.g., from the bottom toward the top of the display) provides forward motion, another vertical sliding motion (e.g., from the top toward the bottom of the display) provides backward motion, a longitudinal sliding motion (e.g., from the left toward the right of the display) provides right turns, and another longitudinal sliding motion (e.g., from the right toward the left of the display) provides left turns.
  • the sliding motions may be reversed if it is desirable such that a sliding motion from the bottom toward the top provides backward motion, a sliding motion from the top toward the bottom provides forward motion, a sliding motion from the left toward the right provides left turns, and a sliding motion from the right toward the left provides right turns.
  • a two-finger expand motion is utilized for forward movement and a two-finger pinch motion is utilized for backward movement.
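  • The touch-based navigation described above amounts to a gesture-to-action dispatch table. The sketch below is an assumed encoding, including the optional reversed swipe mapping; the gesture names stand in for a real touch API.

```python
# Hypothetical gesture-to-navigation dispatch for the street view.

GESTURE_ACTIONS = {
    "swipe_up": "forward",       # bottom-to-top slide
    "swipe_down": "backward",    # top-to-bottom slide
    "swipe_right": "turn_right", # left-to-right slide
    "swipe_left": "turn_left",   # right-to-left slide
    "pinch_out": "forward",      # two-finger expand
    "pinch_in": "backward",      # two-finger pinch
}

def handle_gesture(gesture: str, invert_swipes: bool = False) -> str:
    """Resolve a touch gesture to a navigation action.

    invert_swipes implements the reversed mapping the text allows
    (e.g., a bottom-to-top slide providing backward motion).
    """
    inverse = {"forward": "backward", "backward": "forward",
               "turn_left": "turn_right", "turn_right": "turn_left"}
    action = GESTURE_ACTIONS.get(gesture, "none")
    if invert_swipes and gesture.startswith("swipe"):
        action = inverse.get(action, action)
    return action
```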
  • each of the above described navigation methods are independent and may be enabled and/or utilized separately or together via software/hardware methods.
  • the tilting/rotation functions may be utilized separately without the sliding functions being enabled.
  • the tilting/rotation functions may be utilized along with the sliding functions to provide additional means for navigation.
  • the user 101 can navigate through a virtual reality street view (utilizing various methods described above) to search for desirable brick-and-mortar retailers located near the user's 101 current physical location 102 .
  • the user 101 may elect to leave the current physical location 102 to purchase the specific item from the identified brick-and-mortar retailer.
  • a dynamic mode will be entered where the mobile device 104 utilizes cellular band and modes such as 3G, 4G, LTE, or any other appropriate technique now known or later developed that allows for providing content while the user 101 and the mobile device 104 are in a transient state. While in dynamic mode, the mobile device 104 may continue to operate as previously described, or may operate in a dynamic navigation mode, depending on user selected preferences of the application and/or program. For example, where the mobile device 104 continues to operate as previously described according to selected preferences, the street view will not be updated unless the user 101 performs navigation functions as described above.
  • the mobile device 104 will utilize physical positioning methods (as described above) to determine and update the user 101 and the mobile device 104 positions and thereafter update the navigation street view accordingly.
  • the street view on the display of the mobile device 104 is continuously or periodically updated so that the street view represents the current position of the user 101 .
  • the user may still perform static mode functions for navigation through the street view, after which the street view will revert back to dynamic mode. Reverting back to dynamic mode may be initiated by the user or may be performed automatically (e.g., a timer expires since the user 101 has performed navigation functions).
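  • The static/dynamic mode behavior described above (manual navigation pauses live updates; an idle timer reverts the view to dynamic mode) can be sketched as a small state machine. The 10-second timeout and the method names are assumptions.

```python
# Minimal sketch of the static/dynamic street-view mode switch.

class StreetViewMode:
    IDLE_TIMEOUT_S = 10.0  # assumed idle period before reverting to dynamic

    def __init__(self):
        self.mode = "dynamic"          # live position updates by default
        self._last_manual_nav = None   # timestamp of last manual navigation

    def on_manual_navigation(self, now: float):
        """User tilted/swiped: pause live updates and enter static mode."""
        self.mode = "static"
        self._last_manual_nav = now

    def on_tick(self, now: float):
        """Called periodically; revert to dynamic mode once the timer expires."""
        if (self.mode == "static"
                and now - self._last_manual_nav >= self.IDLE_TIMEOUT_S):
            self.mode = "dynamic"

m = StreetViewMode()
m.on_manual_navigation(0.0)  # user navigates: static mode
m.on_tick(5.0)               # still within the idle window
m.on_tick(10.0)              # timer expired: back to dynamic mode
```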
  • the mobile device 104 operates in an augmented reality mode that has an overlay of the search results regarding the specific item. For example, after a search for the specific item has been performed and the search results data from the content provider 108 has been received by the mobile device 104 , the user 101 may initiate an augmented reality mode by utilizing a program and/or application stored on the mobile device 104 .
  • the augmented reality program and/or application utilizes the camera of the mobile device 104 to render a current street view (e.g., as the user 101 is walking down the street).
  • the current street view is overlaid in real-time with the search results data regarding the specific item, in a manner substantially the same as that described above with regards to the virtual reality street view that is map based.
  • the augmented reality street view (including the overlay) is navigated/updated in real-time as the image from the camera of the mobile device 104 changes (e.g., as the user 101 continues to walk down the street).
  • a real-time augmented reality street view having an overlay of the search results data regarding the specific item, is provided on the display of the mobile device 104 .
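One piece of the real-time augmented reality behavior above is deciding, as the camera image changes, whether a given retailer's overlay should currently be drawn. The sketch below makes that decision from the device's compass heading and the bearing to the retailer; the field-of-view value and the function names are illustrative assumptions, not details from the disclosure.

```python
def overlay_visible(device_heading_deg, retailer_bearing_deg, fov_deg=60.0):
    """True if the retailer lies within the camera's horizontal field of view.

    device_heading_deg: compass heading the camera is pointing (0-360).
    retailer_bearing_deg: bearing from the user to the retailer (0-360).
    """
    # Smallest signed angular difference, normalized into (-180, 180].
    diff = (retailer_bearing_deg - device_heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0
```

As the user walks and turns, the heading changes per frame, so overlays naturally enter and leave the augmented street view.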
  • the various embodiments described above provide one or more advantages. For instance, the user utilizing the above described mobile device will have a more efficient shopping experience, as only the brick-and-mortar retailers that have the specific item in stock will be considered and visited, thereby saving considerable time. Further, the user will be able to determine which retailer has the best price for the specific item.
  • the brick-and-mortar retailer may benefit by increasing its exposure to customers that would have been on-line shoppers, thereby driving more customers to its brick-and-mortar retail locations.
  • the retailer may receive additional traffic within the store, thereby increasing the probability of additional transactions that may result from impulse buys.
  • FIG. 4A is an illustration of process 400 according to one embodiment.
  • Process 400 may be performed, for example, by a human user and/or a mobile device of the user.
  • communication occurs with a provider of digital content to request data regarding a specific item of interest on a mobile device associated with a user of the mobile device.
  • the communication is between the user and/or the user's mobile device and the content provider.
  • Such communicating may include providing an interface to the user and/or sending messages between the user's mobile device and the content provider to access the data of the content provider over a network.
  • the content provider may be, for example, an on-line content provider, as discussed above.
  • the content provider may return search results based on the user/mobile device physical location, which may be determined according to the methods discussed above.
  • the requested data regarding the specific item of interest is received from the provider of digital content on the mobile device.
  • Receiving the requested data may include downloading the requested data to the mobile device and/or streaming the requested data to the mobile device.
  • the received data regarding the specific item of interest is rendered on a display of the mobile device.
  • the rendering includes superimposing the received data on a street view.
  • the street view is based at least in part on a physical location received from an interaction with the user.
  • the street view is navigated according to navigation functions of the mobile device.
  • the navigation may be performed to determine a physical commercial location that has the specific item of interest.
  • the user may perform the navigation functions with the mobile device as described above with reference to FIGS. 2-3 .
  • FIG. 4B is an illustration of exemplary process 450 , adapted according to one embodiment, which may be performed by an application on a user's mobile device.
  • data representative of a specific item is received.
  • the data representative of the specific item is received at the mobile device of a user.
  • the user may provide the data representative of the specific item by manually inputting a search string, scanning a code, providing a digital image, or providing other data representative of the specific item.
  • the application may provide an interface to apprise the user that the data representative of a specific item has been received and to interact with a content provider and/or other third parties.
  • a search request is submitted at a content provider to determine physical commercial locations that carry the specific item.
  • Submitting the search request at the content provider may include establishing a network connection and transferring data to the content provider to initiate the search.
  • the content provider may be, for example, an on-line content provider, as discussed above.
  • search data regarding the specific item is received from the content provider.
  • the received search data includes data regarding a physical commercial location that carries the specific item.
  • the physical commercial location may carry the specific item in stock or may carry the specific item by providing special ordering.
  • the content provider may provide search results based on the user/mobile device physical location, which may be determined according to the methods discussed above.
  • the search data regarding the specific item is overlaid on a street view on the display of the mobile device. Overlaying the search data regarding the specific item may include providing the name of one or more physical commercial locations, the number of items in stock, price per item, size in stock, etc., as described above.
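The per-retailer overlay fields listed above (name, stock count, price, etc.) can be sketched as a small record type that composes the short label shown in the street view. The field names and label format are illustrative assumptions; the example values echo FIG. 2's "Viesal Jacket (3) in stock".

```python
from dataclasses import dataclass

@dataclass
class RetailerOverlay:
    name: str      # brick-and-mortar retailer name
    item: str      # the specific item searched for
    in_stock: int  # number of items in stock at this location
    price: float   # price per item

    def label(self) -> str:
        # Compose the short label drawn over the street view.
        return f"{self.name}: {self.item} ({self.in_stock}) in stock - ${self.price:.2f}"
```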
  • the street view is navigated by performing navigation functions.
  • the user may perform the navigation functions with the mobile device as described above with reference to FIGS. 2-3 .
  • the processes 400 and 450 may include additional steps that may be performed before, during, or after the actions described above. For example, before access to the network and/or content provider is granted, the user may be required to enter a correct combination of a username and a password. In some instances, the user is prompted to become a member, if the user is not already a member.
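The blocks of process 450 above run in a fixed order: receive data representative of the item, submit the search, receive results identifying physical commercial locations, and overlay them on the street view. A minimal pipeline sketch follows; the function names and the dict-based search stub are illustrative assumptions, and a real implementation would call the content provider over the network.

```python
def process_450(item_query, search_fn, overlay_fn):
    """Run the blocks of process 450 in order and return overlay entries."""
    # Submit a search request for the specific item at the content provider.
    results = search_fn(item_query)
    # Keep only results that identify a physical commercial location.
    physical = [r for r in results if "location" in r]
    # Overlay the search data on the street view (here: build label strings).
    return [overlay_fn(r) for r in physical]

# Stub content-provider search returning canned results for demonstration.
def fake_search(query):
    return [
        {"retailer": "Maud's Fashion", "location": (37.77, -122.41), "in_stock": 3},
        {"retailer": "Web-Only Shop", "in_stock": 12},  # no brick-and-mortar location
    ]

labels = process_450("Viesal Jacket", fake_search,
                     lambda r: f"{r['retailer']} ({r['in_stock']}) in stock")
```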
  • FIG. 5 is a block diagram of an example computer system 500 suitable for implementing various methods and devices described herein, for example, the various method blocks of the method 400 and 450 .
  • user devices may comprise a network communications device (e.g., mobile cellular phone, laptop, personal computer, tablet, etc.) capable of communicating with a network
  • a content provider device may comprise a network computing device (e.g., a network server, a computer processor, an electronic communications interface, etc).
  • the computer system 500 such as a mobile communications device and/or a network server, includes a bus component 502 or other communication mechanisms for communicating information, which interconnects subsystems and components, such as processing component 504 (e.g., processor, micro-controller, digital signal processor (DSP), etc.), system memory component 506 (e.g., RAM), static storage component 508 (e.g., ROM), disk drive component 510 (e.g., magnetic or optical), network interface component 512 (e.g., modem or Ethernet card), display component 514 (e.g., cathode ray tube (CRT) or liquid crystal display (LCD)), input component 516 (e.g., keyboard), cursor control component 518 (e.g., mouse or trackball), and image capture component 520 (e.g., analog or digital camera).
  • computer system 500 performs specific operations by processor 504 executing one or more sequences of one or more instructions contained in system memory component 506 .
  • Such instructions may be read into system memory component 506 from another computer readable medium, such as static storage component 508 or disk drive component 510 .
  • hard-wired circuitry may be used in place of (or in combination with) software instructions to implement the present disclosure.
  • Logic may be encoded in a computer readable medium, which may refer to any medium that participates in providing instructions to processor 504 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media.
  • the computer readable medium is non-transitory.
  • non-volatile media includes optical or magnetic disks, such as disk drive component 510
  • volatile media includes dynamic memory, such as system memory component 506 .
  • data and information related to execution instructions may be transmitted to computer system 500 via transmission media, such as in the form of acoustic or light waves, including those generated during radio wave and infrared data communications.
  • transmission media may include coaxial cables, copper wire, and fiber optics, including wires that comprise bus 502 .
  • Computer readable media includes, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other non-transitory medium from which a computer is adapted to read.
  • execution of instruction sequences to practice the present disclosure may be performed by computer system 500 .
  • In various embodiments, a plurality of computer systems 500 coupled by communication link 530 (e.g., a communications network, such as a LAN, WLAN, PSTN, and/or various other wired or wireless networks, including telecommunications, mobile, and cellular phone networks) may perform instruction sequences to practice the present disclosure in coordination with one another.
  • Computer system 500 may transmit and receive messages, data, information and instructions, including one or more programs (i.e., application code) through communication link 530 and communication interface 512 .
  • Received program code may be executed by processor 504 as received and/or stored in disk drive component 510 or some other non-volatile storage component for execution.
  • various embodiments provided by the present disclosure may be implemented using hardware, software, or combinations of hardware and software.
  • the various hardware components and/or software components set forth herein may be combined into composite components comprising software, hardware, and/or both without departing from the spirit of the present disclosure.
  • the various hardware components and/or software components set forth herein may be separated into sub-components comprising software, hardware, or both without departing from the scope of the present disclosure.
  • software components may be implemented as hardware components and vice-versa.
  • Software in accordance with the present disclosure, such as computer program code and/or data, may be stored on one or more computer readable mediums. It is also contemplated that software identified herein may be implemented using one or more general purpose or specific purpose computers and/or computer systems, networked and/or otherwise. Where applicable, the ordering of various steps described herein may be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.

Abstract

The present disclosure provides an exemplary system, method, and computer program product. The exemplary method includes communicating with a provider of digital content to request data regarding a specific item of interest using a mobile device associated with a user. The method further includes receiving the requested data regarding the specific item of interest from the provider of digital content on the mobile device. The method further includes rendering the received data regarding the specific item of interest on a display of the mobile device, the rendering including superimposing the received data on a street view. The method further includes navigating through the street view according to navigation functions of the mobile device to locate a physical commercial location that carries the specific item of interest.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • This application is a continuation of U.S. Ser. No. 13/534,802 filed Jun. 27, 2012, now allowed, the entire disclosure of which is incorporated herein by reference.
  • BACKGROUND
  • 1. Technical Field
  • The present disclosure generally relates to navigating through a virtual or augmented reality and, more particularly, to utilizing a mobile device to navigate through a virtual or augmented reality street view that has search results layered over the display.
  • 2. Related Art
  • Online transactions are becoming more and more prevalent, with an ever-increasing number of online entities that may or may not have a physical real world counterpart. The popularity of online transactions is partially attributable to the ease and convenience of making a transaction online instead of at a physical commercial location.
  • In addition to the on-line channel of commerce, mobile electronic communication devices and applications have been developed that are specifically aimed at commerce as well. As some non-limiting examples, these mobile devices may include smart phones, computer tablets, or laptops. Many of these mobile devices have increased capabilities, as they have been equipped with, among other hardware devices, a Global Positioning System (GPS), a compass, and a digital camera. These capabilities of the mobile devices have not, however, been fully utilized to create a better shopping experience for their users who may be shopping at physical commercial locations.
  • Therefore, while existing mobile devices and their applications have been generally adequate at performing their intended tasks, their full capabilities have not been utilized in certain aspects. Accordingly, it would be advantageous to utilize the capabilities of mobile devices to improve the shopping experience for users of mobile devices who are shopping at a physical commercial location by making the shopping experience easier and more convenient.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an illustration showing a user at a physical location receiving content from a content provider, according to various aspects of the present disclosure.
  • FIG. 2 is an illustration of a mobile device, according to various aspects of the present disclosure.
  • FIG. 3A is an illustration of a side view of a mobile device, according to various aspects of the present disclosure.
  • FIG. 3B is an illustration of a top view of a mobile device, according to various aspects of the present disclosure.
  • FIG. 4A is an illustration of an example method, according to various aspects of the present disclosure.
  • FIG. 4B is an illustration of an example method, which may be performed by an application on a mobile device, according to various aspects of the present disclosure.
  • FIG. 5 is a block diagram of an example computer system suitable for implementing various methods and devices described, according to various aspects of the present disclosure.
  • DETAILED DESCRIPTION
  • It is to be understood that the following disclosure provides many different embodiments, or examples, for implementing different features of the present disclosure. Specific examples of components and arrangements are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. Various features may be arbitrarily drawn in different scales for simplicity and clarity.
  • One embodiment of the present disclosure involves a system. The system includes a display interface operable to receive an input from a user and communicate an output to the user, a transceiver operable to electronically communicate with a content provider, a computer processor operable to execute instructions, and a memory storage operable to store the instructions. The memory storage further comprises a program module that is operable to: submit a search request regarding a specific item at the content provider; receive search data regarding the specific item from the content provider; display a virtual reality street view on the display interface, the virtual reality street view having the search data regarding the specific item overlaid thereon; and navigate through the virtual reality street view by performing navigation functions.
  • Another embodiment of the present disclosure involves a method. The method includes interacting with a provider of digital content to request data regarding a specific item of interest using a mobile device associated with a user, receiving the requested data regarding the specific item of interest from the provider of digital content on the mobile device, rendering the received data regarding the specific item of interest on a display of the mobile device, the rendering including superimposing the received data on a street view, and navigating through the street view according to navigation functions of the mobile device to locate a physical commercial location that carries the specific item of interest.
  • Another embodiment of the present disclosure involves an electronic device. The electronic device includes a non-transitory, tangible computer readable storage medium storing a computer program. The computer program contains instructions that when executed perform: receiving data representative of a specific item, the data representative of a specific item received at the mobile device, which is associated with a user; based on the received data representative of the specific item, submitting a search request at a content provider to determine a physical commercial location that carries the specific item; receiving search data regarding the specific item from the content provider, the received search data regarding the specific item including data regarding a physical commercial location that carries the specific item; overlaying the search data regarding the specific item on a street view on the display of the mobile device; and navigating through the street view according to navigation functions of the mobile device.
  • As the Internet continues to evolve, users have become accustomed to receiving media on-line and on their mobile devices. Modern day mobile electronic devices are often equipped with cameras, Global Positioning System (GPS) transceivers, and various kinds of sensors such as accelerometers, proximity sensors, ambient light sensors, compasses, gyroscopes, etc. These features, along with the communication and computing capabilities, make the portable electronic devices very versatile and powerful. Unfortunately, the potential of these devices has not been fully realized in a physical commercial setting (e.g., brick-and-mortar retail context). For example, the various capabilities of the portable electronic devices have not been utilized to provide a more convenient shopping experience for users of these electronic devices, nor have these capabilities been sufficiently explored to optimize a brick-and-mortar retailer's reach of customers.
  • According to the various aspects of the present disclosure, a method and apparatus are discussed below that take advantage of the various capabilities of the portable electronic devices to offer benefits to both a user and a brick-and-mortar retailer. For instance, one embodiment includes methods whereby a user of a mobile device uses the mobile device to search for a specific item. After the specific item has been found, the mobile device renders a map and/or a street view of a virtual or an augmented reality with an overlay of the physical location of the retailers that have that specific item. The user, thereafter, may use the mobile device to navigate through the street view of the virtual or the augmented reality to identify a desirable brick-and-mortar retailer that has that specific item. Another instance includes embodiments such as provider-side and user-side applications and devices to provide and receive, respectively, the digital data, including specific item data, map and/or street view data, and overlay data, which the mobile device can merge to allow the user to navigate and identify a desirable brick-and-mortar retailer that has that specific item.
  • Thus, various embodiments combine the digital realm with the physical realm. A user seeking to participate in a brick-and-mortar channel of commerce receives digital content on a mobile device and, thus, also participates in an on-line, mobile channel of commerce.
  • FIG. 1 represents an illustration of a system 100 according to various aspects of the present disclosure. In FIG. 1, a user 101 is at physical location 102 and has mobile device 104. The mobile device 104 has access to a network 106 which is used to communicate with one or more on-line content providers 108. The on-line content providers 108 provide digital data to the mobile device 104 of the user 101 via network 106, according to embodiments described further below.
  • Physical location 102 may be a static physical location (i.e., the user 101/mobile device 104 are not moving) or a dynamic physical location (i.e., the user 101/mobile device 104 are moving). In this example, physical location 102 is initially a static physical location such as a coffee house, a hotel lobby, etc. As will be readily apparent from the disclosure below, in various embodiments, the physical location may change dynamically after a period of time. In the present example, the mobile device 104 has a network connection at the physical location 102, according to various network connection methods discussed below.
  • Mobile device 104 may include any type of mobile device configured to access digital data over a network. Examples include a notebook/laptop computer, a tablet computer (such as an iPad™ tablet computer), an MP3 player (such as an iPod™ music player), an e-book reader (such as the Kindle™ reader), a smartphone (such as the iPhone™ phone), and/or the like. Mobile device 104 includes at least one network connection operable to communicate with at least one content provider 108 over network 106. Examples of network connections include 3G/4G/LTE cellular wireless connections, 802.11 (Wi-Fi) connections to a LAN, WAN, or the Internet, a wired connection (such as by Ethernet), and/or the like. As discussed above, mobile device 104 may be equipped with cameras, Global Positioning System (GPS) transceivers, and various kinds of sensors such as accelerometers, proximity sensors, ambient light sensors, compasses, gyroscopes, etc. The mobile device 104 may further include programs and/or applications stored on a computer readable medium that utilize the above disclosed equipment for input/output (I/O) of data and to determine the physical location 102 of the user 101 and the mobile device 104.
  • User 101 may access content provided by on-line content provider 108 by, e.g., accessing content of content provider 108 through a web browser or a specialized application on mobile device 104. For example, user 101 may be seeking to purchase a specific item at a brick-and-mortar retailer nearby. Using the mobile device 104, the user 101 may gain access to the network 106 and thereafter direct a web browser of the mobile device 104 to a content provider 108 website. The content provider 108 may be any content provider such as Google, Yahoo, MSN, MILO, etc. Once at the content provider 108 website, the user 101 submits a search request for the specific item sought. In response to the search request, the content provider 108 provides search result data regarding the specific item to the mobile device 104 via network 106. In alternative embodiments, rather than directing a web browser, the user 101 utilizes the application executing on the mobile device 104 to submit the search request to the content provider 108, which in response provides search result data regarding the specific item to the user's 101 mobile device 104 via network 106.
  • The user 101 may also utilize other means of searching for the specific item. In one embodiment, other means of searching for the specific item include scanning a code representative of the item searched by executing a program or an application that uses the camera of the mobile device 104. Codes representative of the item searched may include a UPC code, an EAN code, a QR code, or any other appropriate code. The scanning may be performed using the mobile device 104 executing code scanning programs and/or applications such as RedLaser®, Code Scanner®, or any appropriate code scanning programs and/or applications. Such applications may be downloaded from both Apple's® App Store and Google's® Android Market. After the code representative of the specific item searched has been scanned with the mobile device 104, the program and/or application contacts the content provider 108 over the network 106 and submits a search request for the specific item. In response, the content provider 108 returns search results data regarding the specific item to the mobile device 104 of the user via network 106.
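Before a scanning application submits a scanned code to the content provider, it would typically validate the code. As one concrete illustration (not a step recited in the disclosure), the sketch below validates a 12-digit UPC-A code by its standard check digit; the EAN and QR cases and the network call are omitted.

```python
def upc_a_is_valid(code: str) -> bool:
    """Validate a 12-digit UPC-A code using its check digit."""
    if len(code) != 12 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    # Odd positions (1-indexed) are weighted 3, even positions weighted 1;
    # the check digit makes the weighted total a multiple of 10.
    total = 3 * sum(digits[0:11:2]) + sum(digits[1:11:2])
    check = (10 - total % 10) % 10
    return check == digits[-1]
```

A single-digit scan error changes the weighted total, so the check digit catches it before a spurious search request is ever sent.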
  • In another embodiment, other means of searching for the specific item include visual identification of the specific item using spatial or pattern recognition. For instance, the mobile device 104 may execute a spatial or pattern recognition program and/or application, which is stored on computer readable medium of the mobile device 104. The spatial or pattern recognition program and/or application may utilize the camera of the mobile device 104 to obtain a digital image of the specific item desired by the user 101. After obtaining the digital image, the program and/or application performs spatial or pattern recognition to identify the specific item and contacts the content provider 108 over the network 106 and submits a search request for the specific item. In response, the content provider 108 returns search results data regarding the specific item to the mobile device 104 of the user via network 106. In alternative embodiments, the program and/or application submits that digital image to the content provider 108, which performs spatial or pattern recognition to identify the specific item and thereafter returns search results data regarding the specific item to the mobile device 104 of the user via network 106.
  • In some instances, content provider 108 downloads the search results data to the mobile device 104, whereas in other embodiments, content provider 108 streams the search results data to the mobile device 104. The search results data, provided by the content provider 108 to the user's 101 mobile device 104, may include data regarding the location/address of brick-and-mortar retailers that have the specific item in inventory, contact information of retailers, number of items in inventory, size in inventory, price of item, product names, ratings of the product, etc. The location of the brick-and-mortar retailer, provided in the search results, may be based on the location of the user 101 and the mobile device 104. In other words, the content provider 108 may limit the search results to brick-and-mortar retailers near the user 101/mobile device 104. As explained further below, there are various techniques for determining a user's location, and the various embodiments may use any such technique now known or later developed. The user 101 may utilize the provided search results to navigate through a virtual/augmented reality street view in order to locate a nearby brick-and-mortar retailer that has the specific item in inventory, as will be readily apparent from the disclosure set forth below.
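Limiting search results to retailers near the user, as described above, amounts to a distance filter on the provider side. The sketch below uses the haversine great-circle distance for that filter; the distance formula, radius, and field names are implementation choices assumed here, not specified in the disclosure.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km = mean Earth radius

def nearby(retailers, user_lat, user_lon, radius_km=5.0):
    """Keep only retailers within radius_km of the user's position."""
    return [r for r in retailers
            if haversine_km(user_lat, user_lon, r["lat"], r["lon"]) <= radius_km]
```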
  • It is a feature of various embodiments to make digital data available to a user based on the user's specific physical location. In one embodiment, a program and/or application on mobile device 104 and/or a program running on a computer at physical location 102 or content provider 108 verifies the location of user 101. In an example wherein mobile device 104 is enabled for the Global Positioning System (GPS) or other satellite-based location service, a GPS receiver built into mobile device 104 discerns the location of the mobile device 104. Through a computer processor, a program and/or application on mobile device 104 and/or a program at content provider 108 analyzes location information received from the GPS receiver and makes a determination of the location of the user 101/mobile device 104 to provide location based search results.
  • In a different embodiment, mobile device 104 communicates with cell towers nearby (for example through a cellular band or mode). The cell towers can be used to triangulate the position based on communication with the mobile device 104 of the user 101. In that manner, the content provider 108 may ascertain the physical location and provide location based search results.
  • In yet another embodiment, mobile device 104 is configured to connect to a network at the physical location 102, so that mobile device 104 is assigned an Internet Protocol (IP) address. The IP address may be received and analyzed by an application on mobile device 104 and/or by the content provider 108. In response to the results of the analysis, content provider 108 may ascertain the physical location of the user 101 and the mobile device 104 and provide location based search results. In still another embodiment, user 101 enters a physical location within the mobile device 104 which is used by the content provider 108 to provide location based search results. Any technique now known or later developed to discern the physical location of the user 101 and the mobile device 104 may be adapted for use in various embodiments.
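The embodiments above describe several ways to discern the device's physical location: GPS, cell-tower triangulation, IP-address analysis, and manual entry. A client might try them in order of typical accuracy, as in this sketch; the source names, ordering, and dict interface are illustrative assumptions.

```python
def resolve_location(sources):
    """Return the first available (name, (lat, lon)) fix, trying sources in
    order of typical accuracy: GPS, cell triangulation, IP, manual entry."""
    for name in ("gps", "cell", "ip", "manual"):
        fix = sources.get(name)
        if fix is not None:
            return name, fix
    raise LookupError("no location source available")
```

Any technique now known or later developed could be slotted into the chain without changing the callers that consume the resulting fix.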
  • Upon receiving the search result data regarding the specific item, in one embodiment, the user 101 utilizes a mapping program and/or application to locate nearby brick-and-mortar retailers that have the specific item in inventory. For instance, the search result data may be rendered in a map view that shows a topical area such as streets, the user 101 location, and the location of the brick-and-mortar retailers that have the specific item in stock. The mapping program and/or application may be stored in whole or in part on computer readable medium of the mobile device 104 of the user 101, and/or may be downloaded or streamed to the mobile device 104 from the content provider 108. The content provider 108 that has provided the search results data may be the same or different from the content provider that provides the mapping data. For instance, the content provider that provides the search results data may be MILO and the content provider that provides the mapping data may be Google. As another example, Google may provide both the search results data and the mapping data to the mobile device 104 of the user 101.
  • Referring to FIG. 2, after viewing the map rendering, the user may decide to switch to a street view 200 and navigate through a virtual reality that displays images of the street at or near the user's 101 current physical location 102. As will be evident from the discussion below, navigating through the virtual reality street view may be performed in a static and/or in a dynamic mode. As illustrated in FIG. 2, the street view includes an overlay of brick-and-mortar stores 202 that have the specific item in stock (e.g., Maud's Fashion, Jackie's Fashion, etc.). Additional information, which was provided in the search results, may be presented to the user, while in street view. In the illustrated embodiment, for example, the number of items in stock (e.g., Viesal Jacket (3) in stock) is displayed. Further, the user may select a button 204 related to the brick-and-mortar retailer 202 to view additional information related to the specific item and/or the retailer 202. Additional information may include retailer contact information (e.g., phone number/website/email), size in inventory, price of item, product names, ratings of the product, etc. As illustrated, a portion of the street view display may be configured to provide user instructions (e.g., Tilt phone to navigate, etc.).
  • FIG. 3A illustrates a schematic side view of the mobile device 104 according to various aspects of the present disclosure and FIG. 3B illustrates a schematic top view of the mobile device 104 according to various aspects of the present disclosure. Referring to FIG. 3A, while in the street view, the user may navigate through the street view longitudinally according to various methods. For instance, the user may navigate forward through the street view by tilting the mobile device 104 in a forward motion 302 to angle θ1, about the x-axis. Further, the user may navigate backward through the street view by tilting the mobile device 104 in a backward motion 304 to angle θ2, about the x-axis. In certain embodiments, by tilting the mobile device 104 to a greater angle about the x-axis from a fixed angle position (e.g., zero angle position), navigating through the street view will occur at a faster pace. Conversely, by tilting the mobile device 104 by a lesser angle from the fixed angle position about the x-axis, navigation through the street view will occur at a slower pace. In certain examples, there is a hysteresis where a small angle change from the fixed angle position does not affect navigation. In further examples, the speed at which the mobile device 104 is tilted about the x-axis affects the pace of navigation through the street view. In still further examples, the duration that the mobile device 104 is tilted about the x-axis affects the pace of navigation through the street view.
  • As such, the rate or pace of forward/backward movement through the street view may be a function of the angle of tilt, speed of tilt, duration of tilt, or a combination of these factors. The tilt angle may range from about a positive 90 degrees (for forward movement) to about a negative 90 degrees (for backward movement). In the illustrated embodiment, the angle ranges from about a positive 45 degrees (for forward movement θ1) to about a negative 45 degrees (for backward movement θ2). The tilt angle, tilt speed, and tilt duration may be determined by utilizing gyroscope and clock hardware of the mobile device 104 and various programs and/or applications installed on computer readable medium and executed on the mobile device 104.
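The tilt-to-pace mapping described above can be sketched as follows. This is a hypothetical illustration, not the disclosed implementation: the function name, the width of the hysteresis deadzone, and the linear scaling are assumptions chosen to match the behavior described (no movement within a small band around the fixed angle position, faster movement at greater tilt, capped at about 45 degrees).

```python
DEADZONE_DEG = 5.0   # assumed hysteresis band: small tilts from the fixed position are ignored
MAX_TILT_DEG = 45.0  # illustrated embodiment caps tilt at about +/-45 degrees

def navigation_pace(tilt_deg: float) -> float:
    """Map a tilt angle about the x-axis to a signed navigation pace.

    Positive values mean forward movement, negative values backward;
    the magnitude grows with the tilt angle beyond the deadzone.
    """
    if abs(tilt_deg) <= DEADZONE_DEG:
        return 0.0  # within the hysteresis band: no navigation
    # Clamp to the illustrated +/-45 degree range.
    clamped = max(-MAX_TILT_DEG, min(MAX_TILT_DEG, tilt_deg))
    sign = 1.0 if clamped > 0 else -1.0
    # Scale linearly from the edge of the deadzone to the maximum tilt.
    magnitude = (abs(clamped) - DEADZONE_DEG) / (MAX_TILT_DEG - DEADZONE_DEG)
    return sign * magnitude
```

The same mapping applies, mutatis mutandis, to the rotation angle about the y-axis for turning; tilt speed and duration could be folded in as additional multiplicative factors.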
  • Referring to FIG. 3B, while in the street view, the user may navigate through the virtual reality angularly according to various methods. For instance, the user may turn right through the street view by rotating the mobile device 104 toward the right 306 to angle θ3, about the y-axis. Further, the user may turn left through the street view by rotating the mobile device 104 toward the left 308 to angle θ4, about the y-axis. In further embodiments, by rotating the mobile device 104 to a greater rotation angle from the fixed angle position (e.g., zero angle position) about the y-axis, turning through the street view will occur at a faster pace. Conversely, by rotating the mobile device 104 by a lesser rotation angle from the fixed angle position about the y-axis, turning through the street view will occur at a slower pace. In certain examples, there is a hysteresis where a small angle change from the fixed angle position does not affect navigation. In further examples, the speed at which the mobile device 104 is rotated affects the rate of turning through the street view. In still further examples, the duration that the mobile device 104 is rotated affects the rate of turning through the street view.
  • As such, the rate of turning through the street view may be a function of the angle of rotation, speed of rotation, duration of rotation, or a combination of these factors. The rotation angle may range from about a positive 90 degrees (for right turns) to about a negative 90 degrees (for left turns). In the illustrated embodiment, the angle ranges from about a positive 45 degrees (for right turns θ3) to about a negative 45 degrees (for left turns θ4). The rotation angle, rotation speed, and rotation duration may be determined by utilizing compass and clock hardware of the mobile device 104 and various programs and/or applications installed on computer readable medium and executed on the mobile device 104.
  • Referring back to FIG. 2, in further embodiments, the user may navigate through the virtual reality street view by activating hard or soft buttons for forward movement, backward movement, left turns, and right turns. For instance, as illustrated in FIG. 2, soft buttons, illustrated as arrow keys 206, are provided on the display and are activated by a user, e.g., by pressing, in order to navigate through the virtual reality street view. The arrow keys 206 point forward, backward, left, and right for navigation through the street view. In some embodiments, buttons that have another form (e.g., triangular, round, oval, square, rectangular, etc . . . ) are located at substantially similar locations and perform substantially similar functions as the arrow keys.
  • In still further embodiments, the user may navigate through the virtual reality street view by utilizing sliding motions on the display. For instance, the mobile device 104 may be configured such that a vertical sliding motion (e.g., from the bottom toward the top of the display) provides forward motion, another vertical sliding motion (e.g., from the top toward the bottom of the display) provides backward motion, a horizontal sliding motion (e.g., from the left toward the right of the display) provides right turns, and another horizontal sliding motion (e.g., from the right toward the left of the display) provides left turns. The sliding motions may be reversed if it is desirable such that a sliding motion from the bottom toward the top provides backward motion, a sliding motion from the top toward the bottom provides forward motion, a sliding motion from the left toward the right provides left turns, and a sliding motion from the right toward the left provides right turns.
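The sliding-motion scheme above amounts to a simple gesture-to-command table, which can be reversed as described. A minimal sketch, assuming hypothetical gesture names and command labels (none of these identifiers appear in the disclosure):

```python
# Assumed gesture labels mapped to the navigation commands described above.
SWIPE_TO_COMMAND = {
    "bottom_to_top": "forward",
    "top_to_bottom": "backward",
    "left_to_right": "turn_right",
    "right_to_left": "turn_left",
}

def reversed_mapping(mapping):
    """Return the reversed scheme, where each sliding motion produces the
    opposite navigation command (forward <-> backward, left <-> right)."""
    opposite = {"forward": "backward", "backward": "forward",
                "turn_right": "turn_left", "turn_left": "turn_right"}
    return {gesture: opposite[cmd] for gesture, cmd in mapping.items()}
```

Either table could be selected through a user preference, consistent with the note that the motions "may be reversed if it is desirable."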
  • In yet still further embodiments, a two-finger expand motion is utilized for forward movement and a two-finger pinch motion is utilized for backward movement. It is understood that each of the above described navigation methods are independent and may be enabled and/or utilized separately or together via software/hardware methods. For example, the tilting/rotation functions may be utilized separately without the sliding functions being enabled. Alternatively, for example, the tilting/rotation functions may be utilized along with the sliding functions to provide additional means for navigation.
  • Referring back to FIG. 1, utilizing the provided search result data overlaid on the map/street view, the user 101 can navigate through a virtual reality street view (utilizing various methods described above) to search for desirable brick-and-mortar retailers located near the user's 101 current physical location 102. Once the user 101 has located one or more brick-and-mortar retailers that have the specific item in stock at a desirable location, price, size, etc . . . , the user may elect to leave the current physical location 102 to purchase the specific item from the identified brick-and-mortar retailer.
  • Should the user 101 leave physical location 102, a dynamic mode will be entered where the mobile device 104 utilizes cellular bands and modes such as 3G, 4G, LTE, or any other appropriate technique now known or later developed that allows for providing content while the user 101 and the mobile device 104 are in a transient state. While in dynamic mode, the mobile device 104 may continue to operate as previously described, or may operate in a dynamic navigation mode, depending on user selected preferences of the application and/or program. For example, where the mobile device 104 continues to operate as previously described according to selected preferences, the street view will not be updated unless the user 101 performs navigation functions as described above. On the other hand, where the mobile device 104 operates in dynamic mode according to selected preferences, the mobile device 104 will utilize physical positioning methods (as described above) to determine and update the user 101 and the mobile device 104 positions and thereafter update the navigation street view accordingly. As such, when operating in dynamic mode, the street view on the display of the mobile device 104 is continuously or periodically updated so that the street view represents the current position of the user 101. It is understood that while operating in dynamic mode, the user may still perform static mode functions for navigation through the street view, after which the street view will revert back to dynamic mode. Reverting back to dynamic mode may be initiated by the user or may be performed automatically (e.g., after a timer expires following the last navigation function performed by the user 101).
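The static/dynamic switching with automatic reversion described above can be modeled as a small state machine. This is an illustrative sketch only; the class name, method names, and the particular timeout value are assumptions, not taken from the disclosure.

```python
class StreetViewMode:
    """Tracks whether the street view follows the device position (dynamic)
    or responds only to explicit navigation input (static)."""

    def __init__(self, revert_after_s: float = 10.0):
        self.mode = "dynamic"           # assumed initial mode while in transit
        self.revert_after_s = revert_after_s
        self._last_nav_input_s = None

    def on_navigation_input(self, now_s: float):
        # A manual (static mode) navigation function suspends dynamic updates.
        self.mode = "static"
        self._last_nav_input_s = now_s

    def tick(self, now_s: float):
        # Revert to dynamic mode once the timer since the last navigation
        # input expires, as described in the paragraph above.
        if (self.mode == "static" and self._last_nav_input_s is not None
                and now_s - self._last_nav_input_s >= self.revert_after_s):
            self.mode = "dynamic"
```

Reverting could equally be triggered by an explicit user action; the timer path shown here corresponds to the automatic case.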
  • In alternative embodiments, rather than utilizing map street view data from the content provider 108, the mobile device 104 operates in an augmented reality mode that has an overlay of the search results regarding the specific item. For example, after a search for the specific item has been performed and the search results data from the content provider 108 has been received by the mobile device 104, the user 101 may initiate an augmented reality mode by utilizing a program and/or application stored on the mobile device 104. The augmented reality program and/or application utilizes the camera of the mobile device 104 to render a current street view (e.g., as the user 101 is walking down the street). The current street view is overlaid in real-time with the search results data regarding the specific item, in a manner substantially the same as that described above with regard to the virtual reality street view that is map based. The augmented reality street view (including the overlay) is navigated/updated in real-time as the image from the camera of the mobile device 104 changes (e.g., as the user 101 continues to walk down the street). As such, a real-time augmented reality street view, having an overlay of the search results data regarding the specific item, is provided on the display of the mobile device 104.
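In an augmented reality mode of this kind, only results near the device's current position are sensibly overlaid on the camera image (compare claim 7, which limits the overlay to a viewable range). A minimal sketch, assuming a flat-plane distance check and an arbitrary viewable range; the function names, coordinate representation, and 200-meter figure are all assumptions for illustration.

```python
import math

VIEW_RANGE_M = 200.0  # assumed viewable range in meters

def within_range(device_xy, retailer_xy, max_m=VIEW_RANGE_M):
    """True if the retailer is within the viewable range of the device."""
    dx = retailer_xy[0] - device_xy[0]
    dy = retailer_xy[1] - device_xy[1]
    return math.hypot(dx, dy) <= max_m

def visible_overlay(device_xy, retailers):
    """Keep only the search-result retailers close enough to be overlaid
    on the current camera street view."""
    return [r for r in retailers if within_range(device_xy, r["xy"])]
```

As the device position updates, re-running the filter keeps the overlay in step with what the camera can actually see.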
  • The various embodiments described above provide one or more advantages. For instance, the user utilizing the above described mobile device will experience a more efficient shopping experience as only the brick-and-mortar retailers that have the specific item in stock will be considered and visited, thereby saving considerable time. Further, the user will be able to determine the retailer that has the best price for a specific item. On the other hand, the brick-and-mortar retailer may benefit by increasing its exposure to customers that would have been on-line shoppers, thereby driving more customers to its brick-and-mortar retail locations. Furthermore, the retailer may receive additional traffic within the store, thereby increasing the probability of additional transactions that may result from impulse buys.
  • FIG. 4A is an illustration of process 400 according to one embodiment. Process 400 may be performed, for example, by a human user and/or a mobile device of the user.
  • At block 402, communication occurs with a provider of digital content to request data regarding a specific item of interest on a mobile device associated with a user of the mobile device. The communication is between the user and/or the user's mobile device and the content provider. Such communicating may include providing an interface to the user and/or sending messages between the user's mobile device and the content provider to access the data of the content provider over a network. The content provider may be, for example, an on-line content provider, as discussed above. The content provider may return search results based on the user/mobile device physical location, which may be determined according to the methods discussed above.
  • At block 404, the requested data regarding the specific item of interest is received from the provider of digital content on the mobile device. Receiving the requested data may include downloading the requested data to the mobile device and/or streaming the requested data to the mobile device.
  • At block 406, the received data regarding the specific item of interest is rendered on a display of the mobile device. The rendering includes superimposing the received data on a street view. The street view is based at least in part on a physical location received from an interaction with the user.
  • At block 408, the street view is navigated according to navigation functions of the mobile device. The navigation may be performed to determine a physical commercial location that has the specific item of interest. The user may perform the navigation functions with the mobile device as described above with reference to FIGS. 2-3.
  • Various embodiments include an application running on a user's mobile device that allows the user to perform the methods described above. For instance, an application may include computer-readable code running on one or more processors at the user's mobile device. The processor executes the code to perform the actions described below. FIG. 4B is an illustration of exemplary process 450, adapted according to one embodiment, which may be performed by an application on a user's mobile device.
  • At block 452, data representative of a specific item is received. The data representative of the specific item is received at the mobile device of a user. For instance, the user may provide the data representative of the specific item by manually inputting a search string, scanning a code, providing a digital image, or providing other data representative of the specific item. Furthermore, the application may provide an interface to apprise the user that the data representative of a specific item has been received and to interact with a content provider and/or other third parties.
  • At block 454, based on the received data representative of the specific item, a search request is submitted at a content provider to determine physical commercial locations that carry the specific item. Submitting the search request at the content provider may include establishing a network connection and transferring data to the content provider to initiate the search. The content provider may be, for example, an on-line content provider, as discussed above.
  • At block 456, search data regarding the specific item is received from the content provider. The received search data includes data regarding a physical commercial location that carries the specific item. The physical commercial location may carry the specific item in stock or may carry the specific item by providing special ordering. The content provider may provide search results based on the user/mobile device physical location, which may be determined according to the methods discussed above.
  • At block 458, the search data regarding the specific item is overlaid on a street view on the display of the mobile device. Overlaying the search data regarding the specific item may include providing the name of one or more physical commercial locations, the number of items in stock, price per item, size in stock, etc., as described above.
  • At block 460, the street view is navigated by performing navigation functions. The user may perform the navigation functions with the mobile device as described above with reference to FIGS. 2-3.
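The blocks of process 450 can be sketched end to end. This is a hypothetical illustration of the flow only: the content-provider interface (`search_fn`), the shape of the result records, and the overlay fields are assumptions, not an API from the disclosure.

```python
def process_450(item_query, search_fn):
    """Run the core blocks of process 450: submit a search for the item
    (blocks 452-454), receive results (block 456), and build the entries
    to overlay on the street view (block 458)."""
    # Blocks 452-454: the item data has been received; submit the search.
    results = search_fn(item_query)
    # Blocks 456-458: build overlay entries from locations carrying the item.
    overlay = [
        {"retailer": r["name"], "in_stock": r["in_stock"], "price": r["price"]}
        for r in results
        if r["in_stock"] > 0
    ]
    return overlay
```

Block 460 (navigation) then operates on the street view carrying this overlay, using the tilt, rotation, button, or sliding inputs described with reference to FIGS. 2-3.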
  • It is understood that the processes 400 and 450 may include additional steps that may be performed before, during, or after actions described above. For example, before the access to the network and/or content provider is granted to the user, the user may be required to enter a correct combination of a username and a password. In some instances, the user is prompted to become a member, if the user is not already a member.
  • FIG. 5 is a block diagram of an example computer system 500 suitable for implementing various methods and devices described herein, for example, the various method blocks of the methods 400 and 450. In various implementations, user devices may comprise a network communications device (e.g., mobile cellular phone, laptop, personal computer, tablet, etc.) capable of communicating with a network, and a content provider device may comprise a network computing device (e.g., a network server, a computer processor, an electronic communications interface, etc.). Accordingly, it should be appreciated that each of the devices may be implemented as the computer system 500 for communication with the network in a manner as follows.
  • In accordance with various embodiments of the present disclosure, the computer system 500, such as a mobile communications device and/or a network server, includes a bus component 502 or other communication mechanisms for communicating information, which interconnects subsystems and components, such as processing component 504 (e.g., processor, micro-controller, digital signal processor (DSP), etc.), system memory component 506 (e.g., RAM), static storage component 508 (e.g., ROM), disk drive component 510 (e.g., magnetic or optical), network interface component 512 (e.g., modem or Ethernet card), display component 514 (e.g., cathode ray tube (CRT) or liquid crystal display (LCD)), input component 516 (e.g., keyboard), cursor control component 518 (e.g., mouse or trackball), and image capture component 520 (e.g., analog or digital camera). In one implementation, disk drive component 510 may comprise an array having one or more disk drive components.
  • In accordance with embodiments of the present disclosure, computer system 500 performs specific operations by processor 504 executing one or more sequences of one or more instructions contained in system memory component 506. Such instructions may be read into system memory component 506 from another computer readable medium, such as static storage component 508 or disk drive component 510. In other embodiments, hard-wired circuitry may be used in place of (or in combination with) software instructions to implement the present disclosure.
  • Logic may be encoded in a computer readable medium, which may refer to any medium that participates in providing instructions to processor 504 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. In one embodiment, the computer readable medium is non-transitory. In various implementations, non-volatile media includes optical or magnetic disks, such as disk drive component 510, and volatile media includes dynamic memory, such as system memory component 506. In one aspect, data and information related to execution instructions may be transmitted to computer system 500 via a transmission media, such as in the form of acoustic or light waves, including those generated during radio wave and infrared data communications. In various implementations, transmission media may include coaxial cables, copper wire, and fiber optics, including wires that comprise bus 502.
  • Some common forms of computer readable media include, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other non-transitory medium from which a computer is adapted to read.
  • In various embodiments of the present disclosure, execution of instruction sequences to practice the present disclosure may be performed by computer system 500. In various other embodiments of the present disclosure, a plurality of computer systems 500 coupled by communication link 530 (e.g., a communications network, such as a LAN, WLAN, PSTN, and/or various other wired or wireless networks, including telecommunications, mobile, and cellular phone networks) may perform instruction sequences to practice the present disclosure in coordination with one another.
  • Computer system 500 may transmit and receive messages, data, information and instructions, including one or more programs (i.e., application code) through communication link 530 and communication interface 512. Received program code may be executed by processor 504 as received and/or stored in disk drive component 510 or some other non-volatile storage component for execution.
  • Where applicable, various embodiments provided by the present disclosure may be implemented using hardware, software, or combinations of hardware and software. Also, where applicable, the various hardware components and/or software components set forth herein may be combined into composite components comprising software, hardware, and/or both without departing from the spirit of the present disclosure. Where applicable, the various hardware components and/or software components set forth herein may be separated into sub-components comprising software, hardware, or both without departing from the scope of the present disclosure. In addition, where applicable, it is contemplated that software components may be implemented as hardware components and vice-versa.
  • Software, in accordance with the present disclosure, such as computer program code and/or data, may be stored on one or more computer readable mediums. It is also contemplated that software identified herein may be implemented using one or more general purpose or specific purpose computers and/or computer systems, networked and/or otherwise. Where applicable, the ordering of various steps described herein may be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.
  • It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures, wherein these labeled figures are for purposes of illustrating embodiments of the present disclosure and not for purposes of limiting the same.
  • The foregoing disclosure is not intended to limit the present disclosure to the precise forms or particular fields of use disclosed. As such, it is contemplated that various alternate embodiments and/or modifications to the present disclosure, whether explicitly described or implied herein, are possible in light of the disclosure. Having thus described embodiments of the present disclosure, persons of ordinary skill in the art will recognize that changes may be made in form and detail without departing from the scope of the present disclosure. Thus, the present disclosure is limited only by the claims.

Claims (20)

What is claimed is:
1. A system comprising:
a non-transitory memory storing instructions;
a location determination system configured to detect a location of a user device; and
one or more hardware processors coupled to the non-transitory memory and the location determination system, and configured to read instructions from the non-transitory memory to cause the system to perform operations comprising:
receiving a search result regarding a requested item;
overlaying data corresponding to the search result on a first person perspective street view, the first person perspective street view being selectable between an augmented reality mode display and a static mode display;
displaying the overlaid data in the augmented reality mode display, the augmented reality mode display updating the first person perspective street view on the display with the overlaid data based on the detected location of the user device;
switching to the static mode display in response to a user input on the user device, the static mode display updating the first person perspective street view and the overlaid data in response to a user navigation function input on the user device; and
switching back to the augmented reality mode display.
2. The system of claim 1, wherein the switching back to the augmented reality mode display further comprises:
switching back from the static mode display to the augmented reality mode display in response to expiration of a timeout period triggered by the user navigation function input.
3. The system of claim 1, wherein the switching back to the augmented reality mode display further comprises:
switching back from the static mode display to the augmented reality mode display in response to detecting a change in location of the user device by the location determination system.
4. The system of claim 1, further comprising a gyroscope, the operations further comprising:
navigating the first person perspective street view during the static mode display in response to one or more tilting operations of the user device detected by the gyroscope,
wherein a speed of the navigating is based on a combination of an angle of rotation of the user device and a speed of rotation of the user device, detected by the gyroscope, as the one or more tilting operations.
5. The system of claim 1, wherein the operations further comprise:
obtaining, by an image capture device, an image of the requested item; and
transmitting, to a content provider server, the image to be analyzed to identify the requested item,
wherein the receiving is in response to the obtaining and the transmitting.
6. The system of claim 1, wherein the location determination system comprises a global positioning system transceiver.
7. The system of claim 1, wherein the displaying further comprises:
receiving a live image feed from an image capture device; and
limiting the overlaid data to displaying in response to the overlaid data being within a viewable range of the user device according to the detected location of the user device.
8. A method comprising:
receiving a search result regarding a requested item at a mobile device;
overlaying, on a display screen of the mobile device, data corresponding to the search result on a first person perspective street view, the first person perspective street view being selectable between an augmented reality mode display and a static mode display;
displaying the overlaid data in the augmented reality mode display, the augmented reality mode display updating the first person perspective street view on the display with the overlaid data;
switching to the static mode display in response to a user input on the mobile device, the static mode display updating the first person perspective street view and the overlaid data in response to a user navigation function input on the mobile device; and
switching back to the augmented reality mode display.
9. The method of claim 8, wherein the switching back to the augmented reality mode display further comprises:
switching back from the static mode display to the augmented reality mode display in response to expiration of a timeout period triggered by the user navigation function input.
10. The method of claim 8, further comprising:
displaying, in response to the receiving the search result, the overlaid data initially in the static mode display; and
switching to the augmented reality mode display in response to detecting a first change in location of the mobile device by a location determination system,
wherein the switching back to the augmented reality mode display further comprises switching back from the static mode display to the augmented reality mode display in response to detecting a second change in location of the mobile device by the location determination system.
11. The method of claim 10, wherein the location determination system comprises a global positioning system transceiver.
12. The method of claim 8, further comprising:
navigating the first person perspective street view during the static mode display in response to one or more tilting operations of the mobile device detected by a gyroscope,
wherein a speed of the navigating is based on a combination of an angle of rotation of the mobile device and a speed of rotation of the mobile device, detected by the gyroscope, as the one or more tilting operations.
13. The method of claim 8, further comprising:
obtaining, by an image capture device of the mobile device, an image of the requested item; and
transmitting, to a content provider server, the image to be analyzed to identify the requested item,
wherein the receiving is in response to the obtaining and the transmitting.
14. The method of claim 8, wherein the displaying further comprises:
receiving a live image feed from an image capture device of the mobile device; and
limiting the overlaid data to displaying in response to the overlaid data being within a viewable range of the user device according to a detected location of the mobile device.
15. A mobile device comprising a non-transitory, tangible computer readable storage medium having stored thereon computer-readable instructions executable to cause the mobile device to perform operations comprising:
overlaying, on a display screen of a mobile device, data corresponding to a search result regarding a requested item on a first person perspective street view, the first person perspective street view being selectable between a dynamic mode display and a static mode display;
displaying the overlaid data in the static mode display, the static mode display updating the first person perspective street view and the overlay of the data in response to a user navigation function input on the mobile device;
switching to the dynamic mode display in response to detecting a change in location of the mobile device by a location determination system; and
displaying the overlaid data in the dynamic mode display, the dynamic mode display updating the first person perspective street view on the display with the overlaid data based on a location of the mobile device detected by a location determination system of the mobile device.
16. The mobile device of claim 15, wherein the operations further comprise:
switching to the static mode display from the dynamic mode display in response to a user input on the mobile device; and
switching back to the dynamic mode display in response to expiration of a timeout period triggered by the user input.
17. The mobile device of claim 15, wherein the dynamic mode display comprises an augmented reality mode display, the operations further comprising, during the augmented reality mode display:
receiving a live image feed from an image capture device of the mobile device; and
limiting the overlaid data to displaying in response to the overlaid data being within a viewable range of the user device according to the detected location of the mobile device.
18. The mobile device of claim 15, wherein the dynamic mode display comprises content provided from a content provider server, the updating comprising updating the first person perspective street view with content obtained from the content provider server.
19. The mobile device of claim 15, the operations further comprising:
navigating the first person perspective street view during the static mode display in response to one or more tilting operations of the mobile device detected by a gyroscope,
wherein a speed of the navigating is based on a combination of an angle of rotation of the mobile device and a speed of rotation of the mobile device, detected by the gyroscope, as the one or more tilting operations.
20. The mobile device of claim 15, the operations further comprising:
obtaining, by an image capture device of the mobile device, an image of the requested item; and
transmitting, to a content provider server, the image to be analyzed to identify the requested item,
wherein the receiving is in response to the obtaining and the transmitting.
US15/211,721 2012-06-27 2016-07-15 Systems, methods, and computer program products for navigating through a virtual/augmented reality Abandoned US20160321840A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/211,721 US20160321840A1 (en) 2012-06-27 2016-07-15 Systems, methods, and computer program products for navigating through a virtual/augmented reality

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/534,802 US9395875B2 (en) 2012-06-27 2012-06-27 Systems, methods, and computer program products for navigating through a virtual/augmented reality
US15/211,721 US20160321840A1 (en) 2012-06-27 2016-07-15 Systems, methods, and computer program products for navigating through a virtual/augmented reality

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/534,802 Continuation US9395875B2 (en) 2012-06-27 2012-06-27 Systems, methods, and computer program products for navigating through a virtual/augmented reality

Publications (1)

Publication Number Publication Date
US20160321840A1 true US20160321840A1 (en) 2016-11-03

Family

ID=49779614

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/534,802 Active 2032-10-04 US9395875B2 (en) 2012-06-27 2012-06-27 Systems, methods, and computer program products for navigating through a virtual/augmented reality
US15/211,721 Abandoned US20160321840A1 (en) 2012-06-27 2016-07-15 Systems, methods, and computer program products for navigating through a virtual/augmented reality

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/534,802 Active 2032-10-04 US9395875B2 (en) 2012-06-27 2012-06-27 Systems, methods, and computer program products for navigating through a virtual/augmented reality

Country Status (1)

Country Link
US (2) US9395875B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220196427A1 (en) * 2020-12-22 2022-06-23 Hyundai Motor Company Mobile Device and Vehicle

Families Citing this family (20)

Publication number Priority date Publication date Assignee Title
JP2015501984A (en) * 2011-11-21 2015-01-19 ナント ホールディングス アイピー,エルエルシー Subscription bill service, system and method
US20140046923A1 (en) 2012-08-10 2014-02-13 Microsoft Corporation Generating queries based upon data points in a spreadsheet application
US20140279275A1 (en) * 2013-03-15 2014-09-18 Autotrader.Com, Inc. Systems and methods for facilitating vehicle transactions using optical data
US9354791B2 (en) * 2013-06-20 2016-05-31 Here Global B.V. Apparatus, methods and computer programs for displaying images
US10629001B2 (en) * 2015-03-04 2020-04-21 Samsung Electronics Co., Ltd. Method for navigation in an interactive virtual tour of a property
US9983693B2 (en) * 2015-03-13 2018-05-29 Adtile Technologies Inc. Spatial motion-based user interactivity
EP3112991B1 (en) * 2015-07-01 2020-02-12 Samsung Electronics Co., Ltd. Method and apparatus for context based application grouping in virtual reality
CN108702549B (en) * 2016-03-17 2022-03-04 惠普发展公司有限责任合伙企业 Frame transmission
US10198861B2 (en) * 2016-03-31 2019-02-05 Intel Corporation User interactive controls for a priori path navigation in virtual environment
TWI596509B (en) * 2016-08-11 2017-08-21 拓景科技股份有限公司 Methods and systems for presenting specific information in a virtual reality environment, and related computer program products
US10134084B1 (en) 2017-11-17 2018-11-20 Capital One Services, Llc Augmented reality systems for facilitating a purchasing process at a merchant location
US20190266742A1 (en) * 2018-02-26 2019-08-29 Lenovo (Singapore) Pte. Ltd. Entity location provision using an augmented reality system
US10964112B2 (en) 2018-10-12 2021-03-30 Mapbox, Inc. Candidate geometry displays for augmented reality
US11461976B2 (en) * 2018-10-17 2022-10-04 Mapbox, Inc. Visualization transitions for augmented reality
US11321411B1 (en) * 2018-12-28 2022-05-03 Meta Platforms, Inc. Systems and methods for providing content
US11783289B1 (en) * 2019-03-11 2023-10-10 Blue Yonder Group, Inc. Immersive supply chain analytics using mixed reality
CA3177901C (en) * 2020-06-01 2024-01-02 Ido Merkado Systems and methods for retail environments
EP3926446A1 (en) * 2020-06-19 2021-12-22 Atos Nederland B.V. Method for moving an object in a virtual environment and device configured to implement such a method
CN112215965B (en) * 2020-09-30 2024-02-20 杭州灵伴科技有限公司 AR-based scene navigation method, device and computer-readable storage medium
US11360576B1 (en) * 2020-12-21 2022-06-14 International Business Machines Corporation Movement pattern-based mobile device user interfaces

Citations (8)

Publication number Priority date Publication date Assignee Title
US6351271B1 (en) * 1997-10-09 2002-02-26 Interval Research Corporation Method and apparatus for sending and receiving lightweight messages
US20090143980A1 (en) * 2005-08-17 2009-06-04 Ingrid Halters Navigation Device and Method of Scrolling Map Data Displayed On a Navigation Device
US20100030469A1 (en) * 2008-07-31 2010-02-04 Kyu-Tae Hwang Contents navigation apparatus and method thereof
US20100123737A1 (en) * 2008-11-19 2010-05-20 Apple Inc. Techniques for manipulating panoramas
US20110141141A1 (en) * 2009-12-14 2011-06-16 Nokia Corporation Method and apparatus for correlating and navigating between a live image and a prerecorded panoramic image
US20110214082A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Projection triggering through an external marker in an augmented reality eyepiece
US8381102B1 (en) * 2011-12-06 2013-02-19 Google Inc. Systems and methods for visually scrolling through a stack of items displayed on a device
US20140086151A1 (en) * 2011-05-09 2014-03-27 Telefonaktiebolaget L M Ericsson (Publ) Quality of service level adaptation for visual services in mobile communication networks

Family Cites Families (37)

Publication number Priority date Publication date Assignee Title
US6621422B2 (en) * 2001-10-01 2003-09-16 Advanced Public Safety, Inc. Apparatus for communicating with law enforcement during vehicle travel and associated methods
JP2005149409A (en) * 2003-11-19 2005-06-09 Canon Inc Image reproduction method and apparatus
US7720436B2 (en) * 2006-01-09 2010-05-18 Nokia Corporation Displaying network objects in mobile devices based on geolocation
US8103445B2 (en) * 2005-04-21 2012-01-24 Microsoft Corporation Dynamic map rendering as a function of a user parameter
US8700586B2 (en) * 2005-10-31 2014-04-15 Yahoo! Inc. Clickable map interface
US20150178777A1 (en) * 2008-02-05 2015-06-25 Google Inc. Informational and Advertiser Links for Use in Web Mapping Services
US20090271369A1 (en) * 2008-04-28 2009-10-29 International Business Machines Corporation Computer method and system of visual representation of external source data in a virtual environment
US9250092B2 (en) * 2008-05-12 2016-02-02 Apple Inc. Map service with network-based query for search
US20100004995A1 (en) * 2008-07-07 2010-01-07 Google Inc. Claiming Real Estate in Panoramic or 3D Mapping Environments for Advertising
JP5372157B2 (en) * 2008-09-17 2013-12-18 ノキア コーポレイション User interface for augmented reality
US8682606B2 (en) * 2008-10-07 2014-03-25 Qualcomm Incorporated Generating virtual buttons using motion sensors
SG171700A1 (en) * 2008-12-19 2011-07-28 Tele Atlas Bv Dynamically mapping images on objects in a navigation system
US8294766B2 (en) * 2009-01-28 2012-10-23 Apple Inc. Generating a three-dimensional model using a portable electronic device recording
WO2011063034A1 (en) * 2009-11-17 2011-05-26 Rtp, Llc Systems and methods for augmented reality
KR101096392B1 (en) * 2010-01-29 2011-12-22 주식회사 팬택 System and method for providing augmented reality
US8463543B2 (en) * 2010-02-05 2013-06-11 Apple Inc. Schematic maps
US20120212406A1 (en) * 2010-02-28 2012-08-23 Osterhout Group, Inc. Ar glasses with event and sensor triggered ar eyepiece command and control facility of the ar eyepiece
US8275375B2 (en) * 2010-03-25 2012-09-25 Jong Hyup Lee Data integration for wireless network systems
US20110279446A1 (en) * 2010-05-16 2011-11-17 Nokia Corporation Method and apparatus for rendering a perspective view of objects and content related thereto for location-based services on mobile device
US8640020B2 (en) * 2010-06-02 2014-01-28 Microsoft Corporation Adjustable and progressive mobile device street view
US20120032974A1 (en) * 2010-08-04 2012-02-09 Lynch Phillip C Method and apparatus for map panning recall
JP5619140B2 (en) * 2010-08-20 2014-11-05 パナソニックインテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Reception display device, information transmission device, optical wireless communication system, reception display integrated circuit, information transmission integrated circuit, reception display program, information transmission program, and optical wireless communication method
TW201209705A (en) * 2010-08-26 2012-03-01 Hon Hai Prec Ind Co Ltd Hand-held electronic device and method for browsing an electronic map
US20120212405A1 (en) * 2010-10-07 2012-08-23 Benjamin Zeis Newhouse System and method for presenting virtual and augmented reality scenes to a user
WO2012048252A1 (en) * 2010-10-07 2012-04-12 Aria Glassworks, Inc. System and method for transitioning between interface modes in virtual and augmented reality applications
US8698843B2 (en) * 2010-11-02 2014-04-15 Google Inc. Range of focus in an augmented reality application
US20130231861A1 (en) * 2010-11-18 2013-09-05 Pioneer Corporation Terminal device, image displaying method and image displaying program executed by terminal device
US20120246003A1 (en) * 2011-03-21 2012-09-27 Hart Gregory M Advertisement Service
US8666815B1 (en) * 2011-09-09 2014-03-04 Google Inc. Navigation-based ad units in street view
US20130116922A1 (en) * 2011-11-08 2013-05-09 Hon Hai Precision Industry Co., Ltd. Emergency guiding system, server and portable device using augmented reality
US9020838B2 (en) * 2011-11-30 2015-04-28 Ncr Corporation Augmented reality for assisting consumer transactions
US20130178257A1 (en) * 2012-01-06 2013-07-11 Augaroo, Inc. System and method for interacting with virtual objects in augmented realities
CA2864003C (en) * 2012-02-23 2021-06-15 Charles D. Huston System and method for creating an environment and for sharing a location based experience in an environment
US9223496B2 (en) * 2012-05-18 2015-12-29 Ebay Inc. User interface for comparing items using gestures
US20130314398A1 (en) * 2012-05-24 2013-11-28 Infinicorp Llc Augmented reality using state plane coordinates
US10275825B2 (en) * 2012-12-04 2019-04-30 Paypal, Inc. Augmented reality in-store product detection system
US9129157B2 (en) * 2013-04-30 2015-09-08 Qualcomm Incorporated Method for image-based status determination

Also Published As

Publication number Publication date
US9395875B2 (en) 2016-07-19
US20140006966A1 (en) 2014-01-02

Similar Documents

Publication Publication Date Title
US9395875B2 (en) Systems, methods, and computer program products for navigating through a virtual/augmented reality
US11227326B2 (en) Augmented reality recommendations
JP6916351B2 (en) Saving the state of the communication session
JP6605000B2 (en) Approach for 3D object display
US20180348988A1 (en) Approaches for three-dimensional object display
US10592064B2 (en) Approaches for three-dimensional object display used in content navigation
US9317113B1 (en) Gaze assisted object recognition
US9263084B1 (en) Selective sharing of body data
US9304646B2 (en) Multi-user content interactions
US9213420B2 (en) Structured lighting based content interactions
US9373025B2 (en) Structured lighting-based content interactions in multiple environments
US11455348B2 (en) Systems and methods for saving and presenting a state of a communication session
US20130254066A1 (en) Shared user experiences
US20150082145A1 (en) Approaches for three-dimensional object display
US20130254647A1 (en) Multi-application content interactions
KR102211453B1 (en) system for providing shopping information
US20120059846A1 (en) Method for retrieving object information and portable electronic device applying the same
RU2744626C2 (en) Device for location-based services
US9176539B2 (en) Key input using an active pixel camera
EP2828768A2 (en) Structured lighting-based content interactions in multiple environments
WO2013142625A2 (en) Structured lighting-based content interactions in multiple environments
KR20160142126A (en) Mobile terminal and method for controlling the same

Legal Events

Date Code Title Description
AS Assignment

Owner name: EBAY INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GERACI, CHAD ANTHONY;REEL/FRAME:045555/0849

Effective date: 20120626

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION