GB2516459A - System and method for providing an interactive virtual model of a physical storefront - Google Patents

System and method for providing an interactive virtual model of a physical storefront

Info

Publication number
GB2516459A
GB2516459A GB1313094.3A GB201313094A
Authority
GB
United Kingdom
Prior art keywords
user
virtual
physical
interactive
storefront
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1313094.3A
Other versions
GB201313094D0 (en)
Inventor
Franco Forghieri
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to GB1313094.3A priority Critical patent/GB2516459A/en
Publication of GB201313094D0 publication Critical patent/GB201313094D0/en
Priority to PCT/IB2014/063272 priority patent/WO2015028904A1/en
Publication of GB2516459A publication Critical patent/GB2516459A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Shopping interfaces
    • G06Q30/0643Graphical representation of items or shoppers

Landscapes

  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A method for providing an interactive virtual model of a physical storefront, the method comprising: presenting a virtual storefront 2 within a user interactive interface of a computing device remotely located from the physical storefront; sensing a grab signal 3 of a user of the computing device indicating a virtual grabbing of a product displayed in the interactive virtual model; and sensing gestures 4 of the user indicating a movement of the grabbed product and displaying the grabbed product within the user interactive interface responsive to the gestures of the user. Virtual objects can be added to the virtual store by extraction from a photo of a physical object. Interaction may be by sensing a head movement, hand movement, viewing direction or voice control, and the method may further comprise cashless payment 5 for items placed in a virtual shopping trolley. It may also include interaction with a second user and an avatar.

Description

System and method for providing an interactive virtual model of a physical storefront

The present application relates to a system and method for providing an interactive virtual model of a physical storefront, in order to provide enhanced virtual shopping experiences.
Today's storefront shopping differs from shopping online in many respects even though often times both the storefront and the online store are operated by the same organization. Because of this difference, some customers prefer to shop at storefronts while others prefer online shopping. Storefront shoppers are often hesitant to utilize online stores due to the manner in which they are required to interact. Storefront shoppers are often able to quickly locate products in a physical store, but frequently have trouble finding products in an online store.
The goals of both online stores and physical stores are generally the same: facilitating the purchase of goods and services by customers. In some cases, online shopping offers advantages over shopping in a physical store. For example, online stores are often open continuously, whereas most physical stores have set hours. Online shoppers are also able to leverage features, such as search functionality, while physical shoppers are not.
However, one drawback of online shopping is that the experience can feel sterile and isolating. Customers in such an environment may be less likely to have positive feelings about the online shopping experience, may be less inclined to engage in the online equivalent of window shopping, and may ultimately spend less money than their counterparts who shop in physical stores.
Further, while online shopping, the customer does not have the opportunity to examine and handle a physical product before purchasing. If the customer does not like the physical product once it has been delivered, the customer has the additional work of returning the product to the supplier.
US 7,685,023 B1 discloses a system and a method for virtualizing a physical storefront, to present an interactive virtual model, in particular a three-dimensional model of a physical storefront, to a user within a user interactive interface of a computing device remotely located from the physical storefront. Therein, at least a portion of the organizational structure of the interactive virtual model can be identical to a portion of the organizational structure of the physical storefront.
However, such an interactive virtual model may not provide all of the information that the customer requires, particularly in the case that the customer does not yet know exactly which product he would like to purchase. This may be the case if the customer is not sure which model of a particular product he wishes to purchase, or if he wants to know, for example, the ingredients and/or expiration date of a food product, or its technical data.
Therefore, methods to improve the experience of online shopping are desirable.
The present invention provides a method for providing an interactive virtual model of a physical storefront. The method comprises the following: An interactive three-dimensional virtual model of a physical storefront is presented within a user interactive interface of a computing device remotely located from the physical storefront. A grab signal of a user of the computing device indicating a virtual grabbing of the product is sensed. Such a grab signal may for example be a pinch grip formed by a thumb and a forefinger of a hand of the user. Further, the grab signal can be a voice control command or a movement of an input device connected to the computing device, too. As used herein, a movement of an input device of the computing device for example means a movement of a mouse or a joystick. Then, gestures of the user indicating a movement of the grabbed product are sensed and the grabbed product is displayed within the user interactive interface responsive to the gestures of the user. Further, the grabbed product can be displayed responsive to a voice control command or a movement of an input device connected to the computing device, too.
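To make the grab-signal step concrete, the following minimal Python sketch detects the pinch grip described above from tracked fingertip positions. The landmark inputs and the distance threshold are illustrative assumptions, not details taken from the application.

```python
import math

PINCH_THRESHOLD_MM = 25.0  # assumed distance below which a pinch is recognised


def distance(a, b):
    """Euclidean distance between two (x, y, z) points in millimetres."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))


def is_pinch_grip(thumb_tip, index_tip):
    """Treat thumb and forefinger closing together as the grab signal."""
    return distance(thumb_tip, index_tip) < PINCH_THRESHOLD_MM


# Fingertips 12 mm apart -> grab signal sensed
print(is_pinch_grip((0.0, 0.0, 0.0), (8.0, 8.0, 4.0)))  # True
```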
The method enables a user to visually and precisely survey a product that is remote from the user, for example within a physical storefront. Therein, the user need not be in the same location as the product in order to obtain visual images of the product. For example, in food trades the user can get more information about the product than the information he can normally see within the interactive virtual model, in particular about the ingredients and/or expiration date of food, which may be cited on the bottom or the backside of the product. The user can also get more information about other products, for example connectors placed on the backside of an electronic product, or about technical data. By virtually grabbing the product, the user acts and has the experience of a physical shopper in a physical storefront and, therefore, the experience of online shopping can be improved. The user of online shopping acting as a physical shopper in a physical storefront has the further advantage that the user can survey products displayed in the interactive virtual model that he does not actually want to purchase, such as offers, new products or a sales discount in a physical storefront, and, therefore, there is the opportunity of impulse buying within online shopping.
Therein, a physical object of the physical storefront can be extracted from a photo, in appearance and size, and added as a virtual object to the interactive virtual model. Further, at least a portion of the organizational structure of the physical storefront can be identical to a portion of the organizational structure of the interactive virtual model, too. Thus, storefront shoppers, who are often able to locate products in a physical storefront, are able to find the same products in an online store represented by the interactive virtual model, too. Therefore, the experience of online shopping can be further improved. Additionally, collaborative shopping is permitted. For example, a user of the computing device interested in a particular product can give a friend shopping in the physical storefront useful information, for example about locating the product in the storefront.
The sensed gestures can comprise a rotating and/or a zooming of the grabbed product. Therefore, the user can interact with the grabbed product as if he were really holding it in his hand and, therefore, act as a physical shopper examining the product in a physical storefront. For example, the user can modify the displayed size of the product and/or regard another surface of the product.
Therefore, the experience of online shopping can be further improved.
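The following sketch illustrates, under assumed conventions, how sensed rotation and zoom gestures could be mapped onto the displayed pose of the grabbed product; the field names and clamping limits are invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class ProductView:
    yaw: float = 0.0    # rotation about the vertical axis, in degrees
    pitch: float = 0.0  # rotation about the horizontal axis, in degrees
    zoom: float = 1.0   # display scale factor


def apply_gesture(view: ProductView, d_yaw: float, d_pitch: float, zoom_factor: float) -> ProductView:
    """Rotate and zoom the grabbed product in response to sensed hand movement."""
    view.yaw = (view.yaw + d_yaw) % 360.0
    view.pitch = max(-90.0, min(90.0, view.pitch + d_pitch))
    view.zoom = max(0.25, min(4.0, view.zoom * zoom_factor))  # clamp to a sane range
    return view


# Turn the product to show its backside and enlarge it slightly
view = apply_gesture(ProductView(), d_yaw=180.0, d_pitch=0.0, zoom_factor=1.5)
print(view)  # ProductView(yaw=180.0, pitch=0.0, zoom=1.5)
```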
A price of the grabbed product can be displayed together with the grabbed product within the user interactive interface.
Therefore, useful information can be presented based on a product which has been grabbed and which the user is actually surveying.
In some embodiments, the method further comprises the following: a head movement of the user is sensed, and a group of products or a single product the user wants to grab is determined responsive to the head movement of the user. In particular, the focus of the presented interactive virtual model can be laid on the products which are placed within a viewing direction of the user. For example, objects displayed within an aisle or a storage rack of the interactive three-dimensional virtual model of the storefront can be selected according to the viewing direction of the user, just as if the user were a physical shopper in a physical storefront. Therefore, the experience of online shopping can be further improved.
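A minimal sketch of such head-movement targeting follows: the product whose direction lies closest to the user's viewing ray is selected. The product coordinates and the angular tolerance are assumptions made for illustration.

```python
import math


def angle_between(v1, v2):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))


def target_product(head_pos, view_dir, products, max_angle_deg=10.0):
    """Return the product closest to the viewing direction, or None."""
    best, best_angle = None, max_angle_deg
    for name, pos in products.items():
        to_product = tuple(p - h for p, h in zip(pos, head_pos))
        angle = angle_between(view_dir, to_product)
        if angle < best_angle:
            best, best_angle = name, angle
    return best


shelf = {"cereal": (1.0, 1.5, 3.0), "coffee": (-0.5, 1.2, 2.5)}
print(target_product((0.0, 1.6, 0.0), (0.3, 0.0, 1.0), shelf))  # cereal
```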
The method may further comprise the step of sensing a gesture of the user indicating a laying of a grabbed product the user wants to purchase into a virtual shopping cart, and displaying the laying of the grabbed product into the virtual shopping cart within the user interactive interface. Therefore, the user can interact with the user interactive interface, which can mimic the physical storefront experience, as if he were a physical shopper in the physical storefront, in particular as if he were virtually walking through a store shifting a shopping cart, and the experience of online shopping can be further improved.
Therein, the gesture of the user indicating a laying of the grabbed product the user wants to purchase into the virtual shopping cart can be a movement of a hand of the user representing throwing something down. Thereby, the user of online shopping can act as if he were laying something into a shopping cart in a physical storefront and, therefore, get the feeling of being a physical shopper in a physical storefront. Further, the gesture of the user indicating a laying of the grabbed product the user wants to purchase into the virtual shopping cart can be any other gesture, such as a head movement of the user or a movement of an input device connected to the computing device, too.
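As an illustration of the throw-down gesture, the sketch below classifies a sharp downward hand velocity as the lay-into-cart signal; the velocity threshold and the coordinate convention are assumed, not taken from the application.

```python
DOWNWARD_SPEED_THRESHOLD = 0.8  # metres per second, an assumed value


def is_throw_down(hand_velocity):
    """hand_velocity is (vx, vy, vz) with the y axis pointing up."""
    return hand_velocity[1] < -DOWNWARD_SPEED_THRESHOLD


cart = []
grabbed_product = "product 11"
if is_throw_down((0.05, -1.2, 0.1)):  # a sharp downward flick of the hand
    cart.append(grabbed_product)      # the product moves into the virtual cart
print(cart)  # ['product 11']
```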
In some embodiments, a viewing direction of the user can be monitored, and the presented interactive virtual model can be altered responsive to a change of the viewing direction.
Thereby, the orientation of the aisles and shopping racks of the online shop and, therefore, the displayed products within the interactive three-dimensional virtual model of the physical storefront can be adjusted responsive to the viewing direction of the user. Accordingly, the user gets the feeling as if he were really standing within a physical storefront and looking around.
Therein, the virtual shopping cart can be displayed within the user interactive interface if the viewing direction of the user points to the ground. In a physical storefront, a physical shopper normally shifts a shopping cart. If the physical shopper wants to know which products are actually within his shopping cart, he looks down into the shopping cart.
Therefore, the user of online shopping can interact with the user interactive interface as if he were shifting such a shopping cart, too, and, therefore, as if he were a physical shopper within the physical storefront presented in the interactive three-dimensional virtual model.
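A minimal sketch of this look-down behaviour, assuming head pitch is reported in degrees with negative values meaning looking down; the trigger angle is an invented value.

```python
LOOK_DOWN_PITCH_DEG = -45.0  # assumed trigger angle


def should_show_cart(head_pitch_deg: float) -> bool:
    """Display the virtual cart when the viewing direction points to the ground."""
    return head_pitch_deg <= LOOK_DOWN_PITCH_DEG


print(should_show_cart(-60.0))  # True: overlay the cart contents
print(should_show_cart(-10.0))  # False: keep showing the aisle
```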
When the virtual shopping cart is displayed within the user interactive interface, virtual images of the products laid into the virtual shopping cart and/or a shopping list can be displayed. Thus, as a physical shopper in a physical storefront, the user can get information about the products actually lying within his shopping cart by looking down, and, therefore, the experience of online shopping can be further improved.
The shopping list may comprise identifications of the products laid into the virtual shopping cart and a total price of the products laid into the virtual shopping cart. Consequently, by looking into the virtual shopping cart, the user does not only see the products actually lying in the virtual shopping cart but also gets information about the total price of these products and, therefore, the shopping experience can be enhanced.
For example, if the actual total price is larger than an available budget of the user, the user can make appropriate changes, for example decide to virtually put products actually lying in the virtual shopping cart back into the shopping racks of the interactive three-dimensional virtual model of the physical storefront.
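The shopping-list behaviour described above can be sketched with a simple data model; the item names, prices and budget are invented for illustration.

```python
from dataclasses import dataclass, field


@dataclass
class CartItem:
    name: str
    price: float


@dataclass
class VirtualCart:
    items: list = field(default_factory=list)

    def total(self) -> float:
        """Total price shown on the shopping list."""
        return sum(item.price for item in self.items)

    def over_budget(self, budget: float) -> bool:
        return self.total() > budget


cart = VirtualCart([CartItem("coffee", 6.99), CartItem("cereal", 3.49)])
print(f"total: {cart.total():.2f}")  # total: 10.48
print(cart.over_budget(8.00))        # True: the user may put products back
```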
In further embodiments, a movement of the user is sensed, and the virtual shopping cart can be moved through the interactive virtual model responsive to the sensed movement of the user.
The movement of the virtual shopping cart is displayed within the user interactive interface. Accordingly, the user gets the feeling as if he were a physical shopper in a physical storefront moving a shopping cart. The shopping cart may be pulled, pushed or rotated to the left or the right and, in particular, moved through the aisles of the interactive three-dimensional virtual model of a physical storefront.
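The following sketch moves the virtual cart in response to sensed pushing and turning; the coordinate conventions and step sizes are assumptions.

```python
import math
from dataclasses import dataclass


@dataclass
class CartPose:
    x: float = 0.0
    z: float = 0.0
    heading_deg: float = 0.0  # 0 = facing straight down the aisle


def move_cart(pose: CartPose, forward_m: float, turn_deg: float) -> CartPose:
    """Push or pull the cart (forward_m, negative = pull) and rotate it (turn_deg)."""
    pose.heading_deg = (pose.heading_deg + turn_deg) % 360.0
    rad = math.radians(pose.heading_deg)
    pose.x += forward_m * math.sin(rad)
    pose.z += forward_m * math.cos(rad)
    return pose


pose = move_cart(CartPose(), forward_m=2.0, turn_deg=0.0)  # push down the aisle
pose = move_cart(pose, forward_m=1.0, turn_deg=90.0)       # turn right at the end
print(round(pose.x, 2), round(pose.z, 2))  # 1.0 2.0
```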
Further, the method may comprise the following: A command of the user is recorded. Responsive to the command of the user, the presented interactive virtual model is adapted. As pointed out, online shoppers are able to leverage features, such as search functionality, while physical shoppers are not. On the other hand, one drawback of online shopping is that the experience can feel sterile and isolating. According to the method of the present invention, the advantages of online shopping and physical shopping can be combined and, therefore, the overall shopping experience can be improved.
The command may be a gesture of the user, a voice control command or a movement of an input device of the computing device.
As used herein, a movement of an input device of the computing device for example means a movement of a mouse or joystick.
Further, the command can comprise a request to display a special product.
The command may also be a request to display a group of products the user wants to visually survey. Thereby, the user is enabled to visually compare similar products.
Also, the command can comprise an order to display another section of the physical storefront represented in the interactive virtual model.
Accordingly, the user is able to leverage features, such as search functionality, and can move from one section of the store to another without having to virtually walk through the whole store. With the user moving from one section of the store to another, or the user requesting to display a special product, the displayed aisles and shopping racks and, therefore, the products actually displayed within the user interactive interface are adapted accordingly. For example, the presented interactive virtual model could be adapted to display a section of the store in which a requested product is assumed to be located. Thereby, the advantages of online shopping and physical shopping can be combined, and the overall shopping experience can be further improved.
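A sketch of such command handling under an assumed vocabulary: "show section" jumps directly to a section, while "find" adapts the model to the section in which the requested product is assumed to be located. The section and product names are invented.

```python
SECTIONS = {"dairy": ["milk", "butter"], "bakery": ["bread", "rolls"]}


def handle_command(command: str) -> str:
    """Adapt the presented model: jump to a section or locate a product."""
    words = command.lower().split()
    if len(words) >= 3 and words[0] == "show" and words[1] == "section":
        section = words[2]
        if section in SECTIONS:
            return f"display section '{section}'"
    if len(words) >= 2 and words[0] == "find":
        product = " ".join(words[1:])
        for section, items in SECTIONS.items():
            if product in items:
                return f"display section '{section}', where '{product}' is assumed to be located"
        return f"'{product}' not found"
    return "unrecognised command"


print(handle_command("show section bakery"))  # display section 'bakery'
print(handle_command("find milk"))            # display section 'dairy', ...
```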
Further, the method may comprise a sensing of a change involving at least one physical object within the physical storefront. A virtual object presented in the interactive virtual model is changed responsive to sensing the change, too, so that the change to the physical object occurring in the physical storefront is reflected in the interactive virtual model and is displayed in the user interactive interface. Therefore, the virtual shopping and storefront shopping experiences can be unified. For example, a physical storefront layout and organizational information can be stored within a planogram in a server. As used herein, a planogram can be a diagram or schematic indicating organizational information about a physical storefront. A planogram can facilitate the automated generation of a virtual storefront model. A planogram can include information including, but not limited to, aisle layout or product location, and the like. Other sources containing organizational information about a storefront can be used alone or be combined with planogram usage, too. Thereby, a virtual storefront model, in particular an interactive three-dimensional virtual model of the physical storefront, can be automatically generated. For example, a location and identity of various in-store products can be obtained from photos, which, when combined with planogram data, provide an extremely realistic presentation of the store. Therein, with dedicated software it is possible to extract an object from a photo of a physical object in a physical storefront, in appearance and size, and to add it to the interactive virtual model of the physical storefront. A further advantage is that as in-store modifications are made, such as moving products, the changes can be automatically detected, which results in an update of the corresponding virtual model of the physical storefront.
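A minimal sketch of a planogram record and of propagating a sensed in-store change into the virtual model; the schema is an assumption, not the application's data format.

```python
# Planogram: organizational information keyed by (aisle, shelf) location.
planogram = {
    ("aisle 3", "shelf 2"): "coffee",
    ("aisle 3", "shelf 3"): "tea",
}
virtual_model = dict(planogram)  # virtual storefront generated from the planogram


def on_physical_change(location, new_product):
    """A sensed in-store change is mirrored into the virtual model."""
    planogram[location] = new_product
    virtual_model[location] = new_product  # what the user interface displays


on_physical_change(("aisle 3", "shelf 2"), "espresso beans")
print(virtual_model[("aisle 3", "shelf 2")])  # espresso beans
```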
The user may further be able to purchase the products laid into the virtual shopping cart. Therefore, cashless payment of the products laid into the virtual shopping cart can be established. For example, the user's credit card can be debited for the amount of the purchase price. Further, a credit transfer or an automatic debit transfer system may be used, too. In one embodiment, a password is used as a safety test in the cashless payment process. The password may be a special gesture, a voice command of the user or any other special attribute identifying the user. Further, the payment process may for example be activated by a special gesture of the user or a head movement of the user.
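The payment step can be sketched as follows, with a stored gesture serving as the password before the cashless charge is triggered; all names are illustrative assumptions and no real payment API is implied.

```python
def authorise(sensed_gesture: str, enrolled_gesture: str) -> bool:
    """Safety test: the sensed gesture must match the user's enrolled password gesture."""
    return sensed_gesture == enrolled_gesture


def pay(cart_total: float, sensed_gesture: str, enrolled_gesture: str) -> str:
    if not authorise(sensed_gesture, enrolled_gesture):
        return "payment refused: gesture did not match"
    # Here the user's credit card, a credit transfer or an automatic debit
    # transfer system would be charged for cart_total.
    return f"charged {cart_total:.2f} via cashless payment"


print(pay(10.48, "circle-then-tap", "circle-then-tap"))  # charged 10.48 via cashless payment
```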
In some embodiments, the purchased products are further delivered to the user. In particular, according to delivery instructions provided by the user, a delivery agent can deliver the purchased products based on delivery options specified in the delivery instructions. In one embodiment, the delivery agent can deliver the products directly to the user's door or desired physical location. Therefore, according to the present invention, for example handicapped or older persons, who are not able to visit a physical storefront, for example a food trade or a drugstore, can have an improved online shopping experience, wherein the advantages of online and physical shopping are combined, and finally get the purchased products delivered.
If a plurality of users is visiting the virtual model of the physical storefront at the same time, the method may further comprise: receiving information associated with activities of a second user of the interactive virtual model of the physical storefront. A representation of activities of the second user can be displayed to the user, and the user is permitted to contact the second user. Therefore, the user's own information may also be combined with the information of other visitors.
In particular, the user can obtain additional information by asking other users questions about a special product and/or by exchanging experiences.
Such further visitors may be represented by avatars, rather than a more generic or uniform icon, within the user interactive interface. Further, an avatar may represent an employee who is available to assist a user of the computing device who requires assistance and requests help.
A system for providing an interactive virtual model of a physical storefront is also provided, which comprises an interface server configured to present an interactive three-dimensional virtual model of a physical storefront within a user interactive interface of a computing device remotely located from the physical storefront to a user. The system further comprises sensing means to sense a grab signal of the user of the computing device indicating a virtual grabbing of a product displayed in the interactive virtual model, second sensing means to sense gestures of the user indicating a movement of the grabbed product, and adapting means to adapt the user interactive interface such that the grabbed product is displayed within the user interactive interface responsive to the gestures of the user. Therein, the sensing means and the second sensing means can be connected to the interface server via a wireless or a wired network.

In some embodiments, the computing device can be a personal computer, a tablet PC or a smartphone. However, the computing device can be any other computing electronic device with networking capabilities to shop from remote locations, too.
In some embodiments, the system may comprise 3D glasses. 3D glasses are normally used to make a movie or a television show look like a 3D scene that is happening right in front of the viewer. Therefore, 3D glasses make a user feel like being part of the action. Accordingly, the user can feel like walking through a physical store although he is not really physically present within the physical store.
In some embodiments, the system can further comprise a plurality of third sensing means to sense a head movement of the user and/or gestures of the user and/or a viewing direction of the user and/or a movement of the user and/or a command of the user. Therein, the sensing means to sense a head movement of the user can, for example, comprise attitude sensors and motion sensors to monitor the head movement. There may further be an evaluation unit for real time or near real time evaluation of the sensed head movement, for example for determining a product displayed in the interactive three-dimensional virtual model of the physical storefront which the user wants to grab.
The means for sensing gestures of the user may, for example, comprise motion sensors and/or mobility sensor devices for biomechanical sensing of muscle contractions, for making sensitive, accurate gesture based control of computing devices. There may also be an evaluation unit for real time or near real time evaluation of the sensed signal, for example for the determination of a grabbed product the user wants to purchase.
Further, the means for sensing a viewing direction can be established through optical sensors or parts of the 3D glasses.
The means for sensing a viewing direction of the user can also comprise motion sensors and/or mobility sensor devices for biomechanical sensing of muscle contractions, for making sensitive, accurate gesture based control of computing devices, too. There may also be an evaluation unit for real time or near real time evaluation of the sensed signals, for example to alter the presented interactive virtual model responsive to a change of the viewing direction. The means for sensing a movement of the user may for example be a part of the electronic device, for example a motion sensor incorporated into the electronic device. There may also be an evaluation unit evaluating the sensed signal in real time or near real time, for example for moving a virtual shopping cart through the interactive virtual model responsive to the sensed movement. Further, the means for sensing a movement of the user can be an apparatus similar to the grip of a physical shopping cart, too. Finally, the means for sensing a command of the user may comprise speech recognition software, mobility sensor devices for biomechanical sensing of muscle contractions for sensitive, accurate gesture based control of computing devices, or means for sensing a movement of an input device of the computing device, such as a mouse or a joystick. There may also be an evaluation unit for real time or near real time evaluation of the sensed commands, for example to adapt the presented interactive virtual model responsive to the command of the user.
In some embodiments, at least a portion of the organizational structure of the physical storefront is identical to a portion of the organizational structure of the interactive virtual model, and the system further comprises a plurality of fourth sensing means, each associated with a physical object of the physical storefront; a physical storefront server configured to automatically detect a location of each of the sensors within the physical storefront and to associate the detected location with a location of the physical object; and a virtual storefront server configured to update a virtual location of virtual objects in the virtual storefront corresponding to the physical objects in accordance with the sensed location data received from the physical storefront server.
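A sketch of this server-side synchronisation under assumed interfaces: sensor readings arrive as (sensor id, location) pairs, each sensor being bound to one physical object. The identifiers and coordinates are invented.

```python
sensor_to_object = {"s1": "coffee", "s2": "tea"}       # fourth sensing means
virtual_locations = {"coffee": (3, 2), "tea": (3, 3)}  # virtual storefront state


def physical_storefront_server(readings):
    """Detect each sensor's location and map it to its physical object."""
    return {sensor_to_object[sensor_id]: loc for sensor_id, loc in readings}


def virtual_storefront_server(object_locations):
    """Update virtual object locations from the sensed physical locations."""
    virtual_locations.update(object_locations)


# The coffee display is moved in the physical store; the virtual model follows.
virtual_storefront_server(physical_storefront_server([("s1", (5, 1))]))
print(virtual_locations["coffee"])  # (5, 1)
```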
Further, the system can comprise a communication server configured to enable real time communications between the user and a second user of the interactive virtual model of the physical storefront and/or an avatar, thereby enabling the user to communicate with further users visiting the virtual shop at the same time, or with an employee.
Embodiments of the invention will now be described with reference to the drawings.
Figure 1 illustrates a flow chart of a method for providing an interactive virtual model of a physical storefront.
Figure 2 illustrates the step of sensing gestures of the user indicating a movement of the grabbed product and displaying the grabbed product within the user interactive interface responsive to the gestures of the user.
Figure 3 illustrates further steps in a method for providing an interactive virtual model of a physical storefront.
Figure 4 illustrates a schematic diagram of a virtual shopping cart.
Figure 5 illustrates a system for providing an interactive virtual model of a physical storefront.
Figure 1 illustrates a flow chart of a method 1 for providing an interactive virtual model of a physical storefront. In this embodiment, the method 1 begins at step 2, when an interactive three-dimensional virtual model of a physical storefront is presented within a user interactive interface of a computing device remotely located from the physical storefront. Within the embodiment shown, the user can interact with the interface, which can mimic the storefront experience.
At step 3, a grab signal of a user of the computing device indicating a virtual grabbing of a product displayed in the interactive virtual model is sensed.
Such a grab signal may for example be a pinch grip formed by a thumb and a forefinger of a hand of the user. Further, the grab signal can be a voice control command or a movement of an input device of the computing device, too. As used herein, a movement of an input device connected to the computing device for example means a movement of a mouse or a joystick.
Then, at step 4, gestures of the user indicating a movement of the grabbed product are sensed and the grabbed product is displayed within the user interactive interface responsive to the gestures of the user. Further, the grabbed product can be displayed responsive to a voice control command or a movement of an input device of the computing device, too.
The step 4 of sensing gestures of the user indicating a movement of the grabbed product and displaying the grabbed product within the user interactive interface responsive to the gestures of the user enables the user to visually and precisely survey a product that is remote from the user within a physical storefront. Therein, the user need not be in the same location as the product in order to obtain visual images of the product. For example, in food trades the user can get more information about the product than the information normally shown by the interactive virtual model, in particular about ingredients or about the shelf life of food, which may be cited on the bottom or a backside of the product. The user can also get more information about other products, for example pins placed on the backside of an electronic product.
Further, by virtually grabbing the product, the user acts and has the experience of a physical shopper in a physical storefront and, therefore, the experience of online shopping can be improved. For example, the user can survey products he does not actually want to purchase, such as offers, new products or a sales discount in a physical storefront, and, therefore, there is the opportunity of impulse buying within online shopping.
Further, figure 1 illustrates the optional steps 5 and 6 of cashless payment of the products the user wants to purchase and delivering the purchased products to the user.
At step 5, the products the user wants to purchase can be paid cashlessly. For example, the user's credit card can be debited for the amount of the purchase price. Alternatively, the purchase price can be paid via an automatic debit transfer system or a credit transfer. In the embodiment of figure 1, a password is used as a safety test in the cashless payment process. The password may be a special gesture, a voice command of the user or any other special attribute identifying the user. Further, the payment process may for example be activated by a special gesture of the user or a head movement of the user.
At step 6, the purchased products are delivered directly to the user. Therefore, a delivery agent will deliver the products directly to the user's door or desired physical locations according to delivery instructions provided by the user.
Figure 2 illustrates the step 4 of sensing gestures of the user indicating a movement of the grabbed product and displaying the grabbed product within the user interactive interface responsive to the gestures of the user.
Therein, reference numeral 10 illustrates a hand of a user making gestures, as if the user were grabbing and handling a product 11 displayed in the interactive three-dimensional virtual model of the physical storefront.
Reference numeral 12 illustrates a user interactive interface, on which the grabbed product 11 is displayed responsive to the gestures of the user.
Therein, the sensed gestures may comprise a rotating and/or a zooming of the grabbed product. Therefore, the user can interact with the product as if he were really holding it in his hand and, therefore, act as a physical shopper in a physical storefront.
In the left part of figure 2, the grabbed product 11 is shown responsive to a first position 13 of the user's hand 10, with a first surface 14 on top. Further, additional information, such as a price 15 of the grabbed product 11, is displayed together with the grabbed product within the user interactive interface 12. If the user now wants to change the size of the displayed product or to regard another surface of the grabbed product 11, on which, for example, the ingredients or expiration date of the product 11 are cited, the user can rotate his hand 10 to a second position 16. Within figure 2, the rotation of the user's hand is symbolized by arrow 17.
The right part of figure 2 shows the grabbed product 11 displayed within the user interactive interface 12 responsive to the second position 16 of the user's hand 10. As can be seen, a second surface 18 of the product is displayed, on which additional information 19, for example about the ingredients or expiration date of the product 11, is cited. Therefore, the user can act and has the experience of a physical shopper in a physical storefront and, therefore, the experience of online shopping can be improved.
Figure 3 illustrates further steps in a method 1 for providing an interactive virtual model of a physical storefront.
As can be seen in figure 3, there is illustrated a user 20 of a computing device 21, remotely located from a physical storefront and using an interactive three-dimensional virtual model 22 of the physical storefront.
In the embodiment shown, at least a portion of the organizational structure of the physical storefront is identical to a portion of the organizational structure of the interactive virtual model 22.
The layout and organizational structure of the storefront can be represented in a user interactive interface 12 of the computing device 21. In the embodiment of figure 3, entities and objects within the storefront, such as shoppers 23, movable displays, not-movable displays, shopping carts 24, and the like, are represented within the interactive three-dimensional virtual model 22. For instance, a physical product in the storefront can be presented as the virtual product 11 within aisle 25 in the interactive three-dimensional virtual model 22. Therein, the user 20 can interact with the user interactive interface 12, which can mimic the storefront experience.
For example by moving his head, the user 20 can determine a product he wants to grab, for example the virtual product 11.
In particular, the user 20 can rotate, raise or drop his head in order to select an object displayed in the interactive three-dimensional virtual model 22 that he wants to visually and precisely survey. Within the embodiment of figure 3, the head movement of the user 20 is symbolized by arrow 26.
Further, the user 20, who is symbolized within the interactive three-dimensional model 22 by shopper 23, can lay a grabbed product he wants to purchase, for example the virtual product 11, into the virtual shopping cart 24, as if he were laying a product into a shopping cart within a physical storefront.
Therefore, gestures of the user 20 indicating a laying of a grabbed product the user wants to purchase into the virtual shopping cart are sensed. In the embodiment of figure 3, the gesture of the user 20 indicating a laying of the grabbed product the user 20 wants to purchase into the virtual shopping cart 24 is a movement of a hand 10 of the user 20 representing throwing something down. This movement of a hand 10 of the user 20 is symbolized by arrow 27 in figure 3.
Also, the presented interactive virtual model 22 can be altered responsive to a change of a viewing direction of the user 20.
Thereby, the focus of the displayed part of the physical storefront can be laid on products the user 20 is actually regarding and seems to be interested in. Within the embodiment of figure 3, the viewing direction of the user 20 is symbolized by arrow 28.
Further, the virtual shopping cart 24 can be moved through the interactive virtual model 22 responsive to a sensed movement of the user 20. Therefore, the user 20 can act and has the experience as if he were a physical shopper within a physical storefront, in particular moving a shopping cart through a physical storefront. The movement of the user 20 is symbolized by arrow 29 in figure 3.
Further, the user 20 may want to move from one section of the store to another. Therefore, referring to the embodiment of figure 3, a command of the user 20 is recorded and the presented interactive virtual model 22 is adapted responsive to the command of the user 20. Referring to the embodiment of figure 3, the command is a voice controlled command. However, the command can be a gesture of the user 20 or a movement of an input device of the computing device 21, such as a mouse or a joystick, too. Therein, the command can comprise a request to display a special product or a group of products the user wants to visually and precisely survey, or can comprise an order to display another section of the physical storefront. Therefore, the advantages of online shopping, such as search functionality, can be combined with the advantages of physical shopping and, therefore, the overall shopping experience can be improved.
According to the embodiment shown, the user 20 can further interact with another user, symbolized by shopper 30, and, therefore, with a second user using the interactive three-dimensional virtual model 22 of the physical storefront at the same time. Therefore, information associated with activities of a second user of the interactive virtual model 22 of the physical storefront is received, a representation of activities of the second user is displayed to the user 20, and the user 20 is permitted to contact the second user.
Therefore, the computing device 21 may contain a keyboard and a microphone as input devices, and receivers for receiving real time text, video images or audio signals from a second user to conduct a conversation in real time via the network.
The interaction between the user 20 and the second user is symbolized by arrow 31 in figure 3.
Further, the user 20 may also be permitted to contact an avatar, for example indicating an employee, available to assist the user 20 who requires assistance.
Further, referring to figure 3, the virtual shopping cart 24 is displayed within the user interactive interface 12 if the viewing direction of the user 20 points to the ground. The viewing direction of the user pointing to the ground is symbolized by arrow 32 in figure 3. Such a visualization of the virtual shopping cart 24 is illustrated referring to figure 4.
Figure 4 illustrates a schematic diagram of a virtual shopping cart 24.
As can be seen in figure 4, virtual images of the products laid into the virtual shopping cart 24 are displayed within the user interactive interface 12. In the embodiment shown, the virtual object 11 is represented in the shopping cart 24, indicating that the user 20 wants to purchase the object 11 and has recently placed it in his shopping cart 24.
Referring to figure 4, there is also a shopping list 33 displayed within the user interactive interface 12. According to the embodiment shown, the shopping list 33 comprises identifications of the products 34 laid into the virtual shopping cart 24 and the total price 35 of the products laid into the virtual shopping cart 24. Therefore, the user 20 gets an overview of the total price of the products actually laid into his virtual shopping cart 24. For example, if the actual total price 35 is larger than an available budget of the user 20, the user 20 can make appropriate changes, for example decide to virtually put products actually lying in the virtual shopping cart 24 back into the shopping racks of the interactive three-dimensional virtual model 22 of the physical storefront.
Figure 5 illustrates a system 40 for providing an interactive virtual model of a physical storefront. In this embodiment, the system 40 comprises an interface server 41 configured to present an interactive three-dimensional virtual model of a physical storefront within a user interactive interface 42 of a computing device 43 remotely located from the physical storefront to a user. Therein, the interface server 41 receives data from the physical storefront via a wireless or a wired network, for example the internet. The system 40 of figure 5 further comprises sensing means 44 to sense a grab signal of the user of the computing device 43 indicating a virtual grabbing of a product displayed in the interactive virtual model, second sensing means 45 to sense gestures of the user indicating a movement of the grabbed product, and adapting means 46 to adapt the user interactive interface 42 such that the grabbed product is displayed within the user interactive interface 42 responsive to the gestures of the user. In the embodiment shown, the second sensing means 45 include means for biomechanical sensing of muscle contractions for sensitive, accurate gesture based control of computing devices. Therein, the sensing means 44 and the second sensing means 45 can be connected to the interface server 41 via a wireless network, which is symbolized by arrow 47 in figure 5. Further, the sensing means 44 and the second sensing means 45 can be connected to the interface server 41 via a wired network, too.
In this embodiment, the computing device 43 is illustrated as a tablet computer 48 and is associated with a user who is able to carry a mobile tablet computer. Further, the computing device 43 may be any other computing electronic device with networking capabilities to shop from remote locations, for example a personal computer or a smartphone, too.
The system 40 further comprises 3D glasses 49. 3D glasses 49 normally make a movie or a television show look like a 3D scene that is happening right in front of the viewer. Therefore, using the 3D glasses 49, the user can virtually walk through a store as if he were truly walking through a real store and, therefore, the experience of online shopping can be further improved.
Figure 5 illustrates that the system 40 further comprises a plurality of third sensing means 50 to sense a head movement of the user and/or gestures of the user and/or a viewing direction of the user and/or a movement of the user and/or a command of the user. Therein, each of the plurality of third sensing means can be a standalone device or can be integrated in the computing device.
Therein, sensing means 51 to sense a head movement of the user include an attitude sensor 52 and a motion sensor 53 as well as an evaluation unit 54 for real time or near real time evaluation of the sensed signals, for example to determine a product the user wants to grab.
The shown sensing means 55 to sense gestures of the user include mobility sensor devices 56 for biomechanical sensing of muscle contractions for sensitive, accurate gesture based control of the computing device 43, and a second evaluation unit 57 for real time or near real time evaluation of the sensed signals, for example for indicating a laying of a grabbed product the user wants to purchase into the virtual shopping cart.
Further, the sensing means 58 to sense a viewing direction of the user in the embodiment shown include an optical sensor 59 and a third evaluation unit 60 for real time or near real time evaluation of the sensed signals, for example for altering the presented interactive virtual model responsive to a change of the viewing direction. The sensing means 58 to sense a viewing direction of the user can also comprise motion sensors and/or mobility sensor devices for biomechanical sensing of muscle contractions, for making sensitive, accurate gesture based control of computing devices, too.
The shown sensing means 61 to sense a movement of the user are realized by a motion sensor 62 incorporated within the computing device 43 associated with the user, and a fourth evaluation unit 63 for real time or near real time evaluation of the sensed signals, for example for moving the virtual shopping cart through the interactive virtual model of the physical storefront. Further, the means 61 for sensing a movement of the user can be an apparatus similar to the grip of a physical shopping cart, too.
The shown sensing means 64 to sense a command of the user include speech recognition software 65 for evaluating voice control commands of the user, in order to display a special product or to display another section of the physical storefront. Further, the command can be a movement of an input device connected to the computing device, too. Therefore, a joystick 70 of the computing device 43 is also shown.
The system 40 of figure 5 further comprises a plurality of fourth sensing means 66, each associated with a physical object of a physical storefront, a physical storefront server 67 configured to automatically detect a location of each of the sensors 66 within the physical storefront and to associate that detected location with a location of the physical object, and a virtual storefront server 68 configured to update a virtual location of virtual objects in the virtual storefront corresponding to the physical objects in accordance with the sensed location data received from the physical storefront server 67. Therefore, in the embodiment of figure 5, real time or near real time updates can ensure that a virtual storefront, in particular an interactive virtual model, and a physical storefront are synchronized.
As illustrated in figure 5, the system 40 also comprises a communication server 69 configured to enable real time communications between the user and a second user of the interactive virtual model of the physical storefront and/or an avatar. Therein, an employee who may be called to assist users who require assistance may be represented by an avatar.

Claims (27)

  1. A method for providing an interactive virtual model of a physical storefront, the method comprising the steps of: presenting an interactive three-dimensional virtual model of a physical storefront within a user interactive interface of a computing device remotely located from the physical storefront; sensing a grab signal of a user of the computing device indicating a virtual grabbing of a product displayed in the interactive virtual model; sensing gestures of the user indicating a movement of the grabbed product and displaying the grabbed product within the user interactive interface responsive to the gestures of the user.
  2. The method according to claim 1, wherein a physical object of the physical storefront can be extracted from a photo, in appearance and size, and added as a virtual object to the interactive virtual model.
  3. The method according to claim 1 or 2, wherein the sensed gestures comprise a rotating and/or a zooming of the grabbed product.
  4. The method according to one of claims 1 to 3, wherein a price of the grabbed product is displayed together with the grabbed product within the user interactive interface.
  5. The method according to one of claims 1 to 4, further comprising the steps of: sensing a head movement of the user; determining a product the user wants to grab responsive to the head movement of the user.
  6. The method according to one of claims 1 to 5, further comprising the step of sensing a gesture of the user indicating a laying of a grabbed product the user wants to purchase into a virtual shopping cart and displaying the laying of the grabbed product into the virtual shopping cart within the user interactive interface.
  7. The method according to claim 6, wherein the gesture of the user indicating a laying of the grabbed product the user wants to purchase into the virtual shopping cart is a movement of a hand of the user representing throwing something down and/or a head movement of the user and/or a movement of an input device connected to the computing device.
  8. The method according to one of claims 1 to 7, further comprising the steps of: monitoring a viewing direction of the user; altering the presented interactive virtual model responsive to a change of the viewing direction.
  9. The method according to claim 8, wherein the virtual shopping cart is displayed within the user interactive interface if the viewing direction of the user points to the ground.
  10. The method according to claim 9, wherein virtual images of the products laid into the virtual shopping cart and/or a shopping list are displayed within the user interactive interface.
  11. The method according to claim 10, wherein the shopping list comprises identifications of the products laid into the virtual shopping cart and a total price of the products laid into the virtual shopping cart.
  12. The method according to one of claims 6 to 11, further comprising the steps of: sensing a movement of the user; moving the virtual shopping cart through the interactive virtual model responsive to the sensed movement of the user and displaying the movement of the virtual shopping cart within the user interactive interface.
  13. The method according to one of claims 1 to 12, further comprising the steps of: recording a command of the user; adapting the presented interactive virtual model responsive to the command of the user.
  14. The method according to claim 13, wherein the command is a gesture, a voice control command or a movement of an input device connected to the computing device.
  15. The method according to claim 13 or 14, wherein the command comprises a request to display a special product.
  16. The method according to claim 13 or 14, wherein the command comprises a request to display a group of products.
  17. The method according to claim 13 or 14, wherein the command comprises an order to display another section of the physical storefront represented in the interactive virtual model.
  18. The method according to one of claims 6 to 17, further comprising the step of cashless payment of the products laid into the virtual shopping cart.
  19. The method according to claim 18, further comprising the step of delivering purchased products to the user.
  20. The method according to one of claims 1 to 19, further comprising the steps of: receiving information associated with activities of a second user of the interactive virtual model of the physical storefront; displaying to the user a representation of activities of the second user; permitting the user to contact the second user.
  21. The method according to one of claims 1 to 20, further comprising the step of permitting the user to contact an avatar.
  22. A system for providing an interactive virtual model of a physical storefront comprising: an interface server configured to present an interactive three-dimensional virtual model of a physical storefront within a user interactive interface of a computing device remotely located from the physical storefront to a user; sensing means to sense a grab signal of the user of the computing device indicating a virtual grabbing of a product displayed in the interactive virtual model; second sensing means to sense gestures of the user indicating a movement of the grabbed product; adapting means to adapt the user interactive interface such that the grabbed product is displayed within the user interactive interface responsive to the gestures of the user.
  23. The system according to claim 22, wherein the computing device is a personal computer, a tablet PC or a smartphone.
  24. The system according to claim 22 or 23, further comprising 3D glasses.
  25. The system according to one of claims 22 to 24, further comprising a plurality of third sensing means to sense a head movement of the user and/or gestures of the user and/or a viewing direction of the user and/or a movement of the user and/or a command of the user.
  26. The system according to one of claims 22 to 25, wherein at least a portion of the organizational structure of the physical storefront is identical to a portion of the organizational structure of the interactive virtual model, the system further comprising: a plurality of fourth sensing means, each associated with a physical object of the physical storefront; a physical storefront server configured to automatically detect a location of each of the sensors within the physical storefront and to associate the detected location with a location of the physical object; a virtual storefront server configured to update a virtual location of virtual objects in the virtual storefront corresponding to the physical objects in accordance with the sensed location data received from the physical storefront server.
  27. The system according to one of claims 22 to 26, further comprising: a communication server configured to enable real time communications between the user and a second user of the interactive virtual model of the physical storefront and/or an avatar.
GB1313094.3A 2013-07-23 2013-07-23 System and method for providing an interactive virtual model of a physical storefront Withdrawn GB2516459A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1313094.3A GB2516459A (en) 2013-07-23 2013-07-23 System and method for providing an interactive virtual model of a physical storefront
PCT/IB2014/063272 WO2015028904A1 (en) 2013-07-23 2014-07-21 System and method for providing an interactive virtual model of a physical storefront

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1313094.3A GB2516459A (en) 2013-07-23 2013-07-23 System and method for providing an interactive virtual model of a physical storefront

Publications (2)

Publication Number Publication Date
GB201313094D0 GB201313094D0 (en) 2013-09-04
GB2516459A true GB2516459A (en) 2015-01-28

Family

ID=49119111

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1313094.3A Withdrawn GB2516459A (en) 2013-07-23 2013-07-23 System and method for providing an interactive virtual model of a physical storefront

Country Status (2)

Country Link
GB (1) GB2516459A (en)
WO (1) WO2015028904A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10586257B2 (en) 2016-06-07 2020-03-10 At&T Mobility Ii Llc Facilitation of real-time interactive feedback

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7680694B2 (en) * 2004-03-11 2010-03-16 American Express Travel Related Services Company, Inc. Method and apparatus for a user to shop online in a three dimensional virtual reality setting
US7685023B1 (en) 2008-12-24 2010-03-23 International Business Machines Corporation Method, system, and computer program product for virtualizing a physical storefront
US9256282B2 (en) * 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
WO2015028904A1 (en) 2015-03-05
GB201313094D0 (en) 2013-09-04

Similar Documents

Publication Publication Date Title
US11403829B2 (en) Object preview in a mixed reality environment
US8606645B1 (en) Method, medium, and system for an augmented reality retail application
US11763361B2 (en) Augmented reality systems for facilitating a purchasing process at a merchant location
US10657573B2 (en) Network site tag based display of images
JP4659817B2 (en) Sales support device
US9020845B2 (en) System and method for enhanced shopping, preference, profile and survey data input and gathering
US20140365333A1 (en) Retail store customer natural-gesture interaction with animated 3d images using sensor array
US20140365336A1 (en) Virtual interactive product display with mobile device interaction
US20140363059A1 (en) Retail customer service interaction system and method
WO2017070286A1 (en) Apparatus and method for providing a virtual shopping space
US20150307279A1 (en) Retail automation platform
US20120239536A1 (en) Interactive virtual shopping experience
EP2625660A2 (en) Interactive collection book for mobile devices
JP6419702B2 (en) Equipment for assistance and convenience
WO2002056217A1 (en) Shopping system
US20170148089A1 (en) Live Dressing Room
WO2014121079A2 (en) 3d virtual store
KR102244660B1 (en) Server providing commodity recommendation service and operating method thereof
WO2023215963A1 (en) Systems and methods for interacting with augmented reality content using a dual-interface
GB2516459A (en) System and method for providing an interactive virtual model of a physical storefront
CN115082158A (en) Article display method and device, electronic equipment and storage medium
CN112700586B (en) Information presentation method and device
US20090216659A1 Method and System for Assisting Customers in Making Purchase Decisions
Han et al. PMM: A Smart Shopping Guider Based on Mobile AR
WO2015164652A1 (en) Improved retail automation platform

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)