WO2018182092A1 - Image-based transaction method and apparatus for performing the method - Google Patents
- Publication number
- WO2018182092A1 (PCT/KR2017/006421)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- image
- information
- product
- store
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0613—Third-party assisted
- G06Q30/0619—Neutral agent
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0633—Lists, e.g. purchase orders, compilation or processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- The present invention relates to an image-based transaction method and an apparatus for performing the method. More specifically, the present invention relates to a method and apparatus that allow a user to carry out transaction activities without visiting an offline store, using an image processing apparatus and a user device.
- An omnidirectional imaging system refers to an imaging system capable of recording image information in all directions (360 degrees) from a specific viewpoint. Compared to conventional imaging systems, images with a much wider field of view can be obtained. Beyond research fields such as computer vision and mobile robotics, omnidirectional imaging is becoming increasingly widespread in practical applications such as surveillance systems, virtual reality systems, pan-tilt-zoom (PTZ) cameras, and video conferencing.
- For example, an omnidirectional image may be generated by stitching images obtained by rotating a single camera about an optical axis that satisfies a single viewpoint.
- Alternatively, a plurality of cameras may be arranged in an annular structure and the images obtained from each camera may be combined.
- a user may generate an omnidirectional image using various omnidirectional image processing apparatuses (or omnidirectional image processing cameras or 360 degree cameras).
- The omnidirectional imaging device may be utilized in various areas. For example, it can be used in areas requiring omnidirectional video surveillance, such as security, or to record the places a traveler has visited.
- the omnidirectional image photographed based on the omnidirectional imaging device may be edited and used as an image for sale of a product.
- the object of the present invention is to solve all the above-mentioned problems.
- another object of the present invention is to perform a transaction for a product without a user visiting an offline store based on the store image information generated by the image processing apparatus.
- Another object of the present invention is to generate a user-provided image based on a store image, generate control information based on the user-provided image, and enable a user to effectively carry out transactions for needed products through the provided image, without physically moving, based on user input information entered through a user interface.
- An image-based transaction method may include receiving, by the merchandise transaction service server, store image information from the image processing device; generating, by the merchandise transaction service server, user-provided information based on the store image information; and transmitting, by the merchandise transaction service server, the user-provided information to a user device, wherein the user-provided information may include user-provided image information and control information.
- a commodity transaction service server for an image-based transaction may include a communication unit for data communication with an image processing device and a user device, and a processor operatively connected to the communication unit.
- The processor is configured to receive store image information from the image processing device, generate user-provided information based on the store image information, and transmit the user-provided information to a user device, wherein the user-provided information may include user-provided image information and control information.
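The server-side flow above can be sketched as follows. This is a minimal illustrative sketch in Python, assuming simple dict-based messages; the class, method, and field names are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class UserProvidedInfo:
    # User-provided information = user-provided image information + control information.
    image_info: dict
    control_info: dict

class CommodityTransactionServer:
    def generate_user_provided_info(self, store_image_info: dict) -> UserProvidedInfo:
        # Receive store image information and derive the two parts the patent names:
        # user-provided image information, and control information
        # (movement control + transaction control).
        image_info = {"frames": store_image_info.get("frames", [])}
        control_info = {
            "movement": {"branch_points": store_image_info.get("branch_points", [])},
            "transaction": {"products": store_image_info.get("products", [])},
        }
        return UserProvidedInfo(image_info, control_info)

server = CommodityTransactionServer()
info = server.generate_user_provided_info(
    {"frames": ["aisle_1"], "branch_points": ["b1"], "products": ["orange"]}
)
```

In a real deployment the transmit step would serialize `info` to the user device; here the object itself stands in for that message.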
- a transaction for a product may be performed without a user visiting an offline store based on the store image information generated by the image processing apparatus.
- A user-provided image is generated based on a store image, control information is generated based on the user-provided image, and, based on user input information entered through a user interface, needed products can be traded effectively through the provided image without the user physically moving.
- FIG. 1 is a conceptual diagram illustrating an image-based commodity trading system according to an embodiment of the present invention.
- FIG. 2 is a conceptual diagram illustrating an operation of an image processing apparatus according to an exemplary embodiment of the present invention.
- FIG. 3 is a conceptual diagram illustrating an operation of a commodity transaction service server according to an embodiment of the present invention.
- FIG. 4 is a conceptual diagram illustrating an operation of a commodity transaction service server according to an embodiment of the present invention.
- FIG. 5 is a conceptual diagram illustrating an operation of a commodity transaction service server according to an embodiment of the present invention.
- FIG. 6 is a conceptual diagram illustrating an operation of a commodity transaction service server according to an embodiment of the present invention.
- FIG. 7 is a conceptual diagram illustrating an operation of a commodity transaction service server according to an embodiment of the present invention.
- FIG. 8 is a conceptual diagram illustrating an operation of a commodity trading service according to an embodiment of the present invention.
- FIG. 9 is a conceptual diagram illustrating an omnidirectional image processing apparatus according to an embodiment of the present invention.
- FIG. 10 is a conceptual diagram illustrating characteristics of a plurality of image capturing units located in an omnidirectional image processing apparatus according to an exemplary embodiment of the present invention.
- FIG. 11 is a conceptual diagram illustrating image pickup lines of a plurality of image pickup units according to an exemplary embodiment of the present invention.
- FIG. 12 is a conceptual diagram illustrating image pickup lines of a plurality of image pickup units according to an exemplary embodiment of the present invention.
- the image processing apparatus of the embodiment of the present invention may include an omnidirectional image processing apparatus.
- The omnidirectional image processing apparatus may include an omnidirectional camera (360-degree camera) capable of capturing an omnidirectional (or 360-degree) image.
- The term "product" may be used to mean not only an article having a physical appearance but also a service product having no physical appearance.
- FIG. 1 is a conceptual diagram illustrating an image-based commodity trading system according to an embodiment of the present invention.
- Referring to FIG. 1, a commodity trading system for purchasing products based on images, without the user visiting an offline store, is disclosed.
- the commodity trading system may include an image processing apparatus 100, a commodity trading service server 120, and a user device 140.
- the image processing apparatus 100 may be implemented to generate an image (eg, an omnidirectional image) of a product (or a store). For example, merchandise may be placed on a stand (or shelf) for sale of merchandise in an offline store.
- the image processing apparatus 100 may move through a moving path of an offline store, and generate an image of a store and / or a product arranged in the store.
- An image of a store generated by the image processing apparatus 100 and / or a product arranged in the store may be expressed by the term store image.
- the image processing apparatus 100 may generate a virtual image of the offline store without capturing an actual offline store.
- the virtual image for the offline store may be an image generated by virtually setting a store, a stand, a product, a moving path, etc. in a virtual space.
- the store image may be used to mean a virtual image of an offline store.
- The commodity transaction service server 120 may generate user-provided information for product purchase for the user's user device 140 by processing the store image received from the image processing apparatus 100.
- the user provided information may include a user provided image and control information (eg, movement control information and transaction control information) for the user's virtual movement and product purchase in the user device 140.
- the user-provided image may be an image of a store provided to the user device 140 generated based on the store image and output.
- The commodity transaction service server 120 may generate the user-provided image through image processing, such as removing overlapping or unnecessary portions of the store image and dividing the store image by movement path within the store.
- the merchandise transaction service server 120 may determine a user-provided image by excluding overlapped portions of the store images generated by the image processing apparatus 100.
- The store image may be divided for each path, in consideration of the location information at which it was captured, to generate a user-provided image for each path. For example, when a path divides into a first path and a second path at a branch point, a first-path user-provided image may be generated based on the first store image for the first path, and a second-path user-provided image may be generated based on the second store image for the second path.
- The product transaction service server 120 may generate movement control information for the user's virtual movement on the user-provided image. For example, the product transaction service server 120 may determine a branch point of the in-store movement path existing in the user-provided image and generate movement control information for outputting a user interface (movement) at the branch point. The user interface (movement) may be implemented to determine the user's virtual movement direction. In addition, the product transaction service server 120 may generate movement control information for matching the user-provided image with the information input through the user interface (movement) at the branch point.
- The merchandise transaction service server 120 may generate movement control information for outputting a user interface (movement) at a branch point in the store image, receiving user input information through the user interface (movement), and providing a user-provided image corresponding to the user's virtual moving direction according to the user input information.
- a user interface may be output based on movement control information on a user-provided image output through the user device 140.
- User input information indicating a moving direction may be input through a user interface (movement).
- the user-provided image corresponding to the movement direction indicated by the user input information based on the movement control information may be provided to the user. In this manner, the user may indicate the direction of movement through the user device 140 and virtually move in the store.
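The matching between a branch point, a chosen moving direction, and the next user-provided image can be sketched as a lookup table. This is an illustrative sketch only; all identifiers are assumptions, not terms from the patent.

```python
# Movement control information as a lookup table:
# (branch point, chosen direction) -> identifier of the matched user-provided image.
movement_control = {
    ("branch_1", "left"): "user_image_path_1",
    ("branch_1", "straight"): "user_image_path_2",
    ("branch_1", "right"): "user_image_path_3",
}

def next_user_image(branch_point: str, direction: str) -> str:
    # Return the user-provided image matched to the user's chosen moving direction,
    # as indicated through the user interface (movement) at the branch point.
    return movement_control[(branch_point, direction)]

chosen = next_user_image("branch_1", "right")
```

A user device would call `next_user_image` each time the user selects a direction at a branch point, then play the returned image segment.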
- The product transaction service server 120 may generate transaction control information for acquiring product information, selecting products, and performing the user's product transaction procedure.
- the product transaction service server 120 may extract a product image existing in the user-provided image as object information, and determine a product corresponding to the product image.
- the commodity transaction service server 120 may generate transaction control information for matching commodity information on the determined commodity with a commodity image.
- the product transaction service server 120 may generate transaction control information for providing additional product information and processing a product.
- the user interface may be implemented to select and trade a product on a user provided image.
- the transaction control information may include information for the selection / transaction of the commodity, such as the price of the commodity, the origin of the commodity, and the trading conditions of the commodity.
- Product information may be matched with a product image in a user-provided image based on the transaction control information generated by the product transaction service server 120 and output as image information.
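The matching of product information to product images extracted as objects can be sketched as a simple table lookup. A minimal sketch, assuming detected objects arrive as product names; all products, prices, and origins below are made-up examples, not data from the patent.

```python
# Transaction control information: product record per recognizable product image.
# Values are illustrative placeholders.
transaction_control = {
    "orange":     {"price": 1200, "origin": "Jeju"},
    "watermelon": {"price": 8000, "origin": "Hamyang"},
    "strawberry": {"price": 4500, "origin": "Nonsan"},
}

def match_product_info(detected_objects):
    # For every product image detected in the user-provided image, return the
    # matched product information; non-product objects are ignored.
    return {name: transaction_control[name]
            for name in detected_objects if name in transaction_control}

matched = match_product_info(["orange", "strawberry", "cart"])
```

The matched records are what the user device would overlay on the user-provided image as output image information.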
- additional commodity information on a commodity may be provided by a user through a user interface (transaction) and a transaction procedure may be performed.
- the product transaction service server 120 may receive the product transaction request information for the product selected by the user, and proceed with the product transaction process based on the product transaction request information.
- the user may request payment for goods purchased while virtually moving around the store through the user device 140.
- the product transaction request information may be delivered to the product transaction service server 120 when the payment request is made.
- the commodity transaction service server 120 may receive payment information (for example, card information) for payment from the user device (which may be a separate user device) 140 and proceed with a payment procedure for the product.
- the payment procedure may be performed in conjunction with a separate bank server.
- Goods for which the transaction has been completed through the goods transaction service server 120 may be delivered to the user.
- The commodity transaction service server 120 may provide the user-provided images to the user device all at once, or may receive a user-provided image request from the user device 140 according to user input information entered through the user interface and provide only the requested user-provided image.
- the commodity transaction service server 120 may include a communication unit for data communication with the image processing apparatus and the user device, and may include a processor operatively connected to the communication unit. Operations of the commodity transaction service server 120 disclosed in the embodiment of the present invention may be performed based on a processor.
- the user device 140 may be implemented to purchase a product of the user.
- the user device 140 may receive user-provided information from the merchandise transaction service server 120 and output a user-provided image.
- user input information may be input through the user device 140, and a purchase procedure for a product may be performed while virtually moving in a store based on movement control information and transaction control information corresponding to the user input information.
- The user device 140 may receive user input information through a user interface (movement) and generate the user's virtual movement through the store on the user-provided image.
- The user device 140 may receive user input information through a user interface (transaction), and may select a product, provide information about the product, and purchase the product on the user-provided image.
- The user device 140 may output an omnidirectional virtual reality (VR) / augmented reality (AR) image, and may be a device that can receive hand movements as user input information.
- For example, the user device 140 may be a head mounted display (HMD), and may interpret and receive the movement of the user's hand as user input information.
- The user may virtually move through the store by selecting a direction of movement through the user interface (movement) on the user-provided image provided through the user device 140.
- the user-provided image may output a user interface (movement) at the branch point.
- An icon (or image) indicating a direction to move included in the user interface (movement) may be selected by the user's hand.
- The user device 140 may recognize the movement of the hand as user input information, and a user-provided image corresponding to the specific direction selected by the user may be provided to the user based on the movement control information.
- The user may select and purchase a product through the user interface (transaction) on the user-provided image provided through the user device 140.
- a product to be purchased on the user-provided image may be selected by hand.
- The user device 140 may recognize the movement of the hand as user input information through the user interface (transaction), provide information (price, origin, etc.) about the product selected by the user based on the transaction control information, and perform the product purchase procedure.
- FIG. 2 is a conceptual diagram illustrating an operation of an image processing apparatus according to an exemplary embodiment of the present invention.
- the image processing apparatus may be an omnidirectional image processing apparatus, and the generated image may be an omnidirectional image.
- the image processing apparatus 200 may capture information about goods in a store.
- A manager or a moving device may move the image processing apparatus 200 through the store, and the image processing apparatus 200 may generate a store image by capturing a 360-degree image of the store.
- the image processing apparatus 200 may be a wearable 360 degree image capturing apparatus, and a manager wearing the image processing apparatus 200 may generate a store image while moving in a store. That is, the image processing apparatus 200 may move various paths of the store, and generate a store image by capturing the goods placed on the store.
- the movement path of the image processing apparatus 200 may be variously set.
- in-store map information may be input, and the image processing apparatus 200 may move according to the in-store map information.
- the in-store map information may include a movement route in a store and a location of a product in a store (or information about a product for each location in a store).
- the moving path of the image processing apparatus 200 may be determined based on the in-store map information.
- the moving path of the image processing apparatus 200 may include all paths in the store, but may be determined so that the overlapping paths are minimized.
- the moving path of the image processing apparatus 200 may be determined in consideration of the imageable distance of the image processing apparatus 200.
- The moving speed of the image processing apparatus 200 may be determined based on whether products exist along the in-store moving path. Where no products exist along the moving path, the moving speed of the image processing apparatus 200 may be relatively high, and where products exist, the moving speed may be relatively low so that the products are captured clearly.
- Information on the moving path and information on the overlapping path of the image processing apparatus 200 may be transmitted to the commodity transaction service server.
- the commodity transaction service server may remove an image of the overlapping paths, and may segment the store image based on the information on the moving path of the image processing apparatus 200.
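Removing images of overlapping passes and segmenting the store image per movement path can be sketched as below. The tuple layout, identifiers, and grid-style locations are assumptions for illustration, not the patent's data format.

```python
def deduplicate_and_segment(frames):
    # frames: capture-order list of (path_id, location, frame_id) tuples.
    # Drop frames whose location was already captured on an earlier pass
    # (the overlapping-path images), then group survivors by movement path.
    seen_locations = set()
    segments = {}
    for path_id, location, frame_id in frames:
        if location in seen_locations:
            continue  # overlapping pass: this area is already covered
        seen_locations.add(location)
        segments.setdefault(path_id, []).append(frame_id)
    return segments

segments = deduplicate_and_segment([
    ("path_1", (0, 0), "f1"),
    ("path_1", (0, 1), "f2"),
    ("path_2", (0, 1), "f3"),  # overlaps the area of f2 -> dropped
    ("path_2", (0, 2), "f4"),
])
```

Each value in `segments` corresponds to one per-path user-provided image with duplicate coverage removed.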
- The store image generated by the image processing apparatus 200 may be a virtual image.
- the virtual image may be generated by arranging a virtual shelf in a virtual store space and placing a virtual product on the virtual shelf.
- the image processing apparatus 200 may determine the path in the virtual store space in consideration of the set virtual store space, the virtual stand, and the location of the virtual goods, and generate a virtual store image in consideration of the path in the store.
- FIG. 3 is a conceptual diagram illustrating an operation of a commodity transaction service server according to an embodiment of the present invention.
- Referring to FIG. 3, an operation of the product transaction service server for generating a user-provided image based on a received store image is disclosed.
- the commodity transaction service server may receive information on the store image from the image processing apparatus.
- the store image may be an omnidirectional image (or 360 degree image).
- the product transaction service server may also receive location information and in-store map information where the store image is captured from the image processing apparatus.
- The merchandise transaction service server may determine areas that have been unnecessarily photographed multiple times, based on the store image information, the store image capture location information, and the in-store map information.
- the merchandise transaction service server may select one store image among a plurality of store images corresponding to a plurality of captured regions, or generate a store image by combining a plurality of store images.
- For example, the image processing apparatus may move several times, for image capture, along the movement paths in the store where fruit stands are located.
- A first store image 310 may be generated based on the first movement, a second store image 320 based on the second movement, and a third store image 330 based on the third movement.
- The product transaction service server may determine only one store image (e.g., the first store image 310) among the plurality of store images of the fruit area as the user-provided image, and remove the remaining store images (e.g., the second store image 320 and the third store image 330).
- Alternatively, one user-provided image may be generated by combining the first store image 310, the second store image 320, and the third store image 330.
- The merchandise transaction service server may segment the store image for each path within the store. For example, when the path divides at a branch point into a first path 350, a second path 360, and a third path 370, the merchandise transaction service server may generate the store image of the first path 350 as a first user-provided image, the store image of the second path 360 as a second user-provided image 365, and the store image of the third path 370 as a third user-provided image 375. That is, a user-provided image for each path may be generated by matching the store image for each path.
- In this way, the merchandise transaction service server may edit areas that have been unnecessarily photographed multiple times, based on the store image information, the store image capture location information, and the in-store map information, and may divide the store image by path within the store to generate user-provided images.
- The commodity transaction service server may generate movement control information and transaction control information based on the generated user-provided image. Referring to FIGS. 4 and 5, the operation of the commodity transaction service server for generating movement control information and transaction control information based on the generated user-provided image is disclosed.
- FIG. 4 is a conceptual diagram illustrating an operation of a commodity transaction service server according to an embodiment of the present invention.
- the movement control information may be control information for providing a user-provided image corresponding to user input information input through a user interface (movement).
- the commodity transaction service server may generate movement control information for matching the user-provided image for each path.
- the user-provided image may be provided through the user device according to the input information input through the user interface (movement) based on the movement control information.
- Movement control information for matching the user-provided image corresponding to each path may be generated based on the branch point 400.
- A first user-provided image 415 for the first path 410, a second user-provided image 425 for the second path 420, and a third user-provided image 435 for the third path 430 may be matched.
- the movement control information may include information for matching the user-provided image for each path.
- The first user-provided image 415 may be an omnidirectional image captured as the image processing apparatus moves along the first path 410, the second user-provided image 425 an omnidirectional image captured as it moves along the second path 420, and the third user-provided image 435 an omnidirectional image captured as it moves along the third path 430.
- Movement control information for providing a user interface (movement) on the branch point 400 may also be generated.
- the branch point 450 provided with the user interface (movement) may be set even when the path is not divided.
- a path may be set based on the branch point 450 to set movement control information. That is, the branch point may be a position set to output the user interface (movement).
- the branch point may be set at a predetermined position (distance interval) or may be set in consideration of the position of the product.
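Placing branch points at a predetermined distance interval, supplemented by product positions, might be sketched as follows. Units, the interval, and the positions are illustrative assumptions.

```python
def place_branch_points(path_length, interval, product_positions):
    # Branch points at a predetermined distance interval along the path...
    points = set(range(interval, path_length, interval))
    # ...plus branch points set in consideration of product positions.
    points.update(p for p in product_positions if 0 < p < path_length)
    return sorted(points)

branch_points = place_branch_points(path_length=100, interval=30,
                                    product_positions=[45, 90])
```

Each returned position is a point where the user interface (movement) would be output, whether or not the path actually divides there.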
- For example, a stand selling first′ products (e.g., fruit) may be located on a first′ path (left path), a stand selling second′ products (e.g., vegetables) on a second′ path (right path), and a stand selling third′ products (e.g., dairy products) on a third′ path (straight path). Information matching a first′ user-provided image 465, which moves along the first′ path 460 for closer examination of the first′ products, a second′ user-provided image 475, which moves along the second′ path 470 for closer examination of the second′ products, and a third′ user-provided image 485, which moves along the third′ path 480 for closer examination of the third′ products, may be generated as movement control information.
- The commodity transaction service server may set a path name for each path and match a user-provided image for each path to be output based on user input information. For example, when a stand displaying fruit appears on moving along the first path, the path name of the first path may be set to "fruit", and movement control information may be generated for matching the first user-provided image with the user-interface input information indicating the first path (a left direction indicator). Similarly, when a stand displaying dairy products appears on moving along the second path, the path name of the second path may be set to "dairy", and movement control information may be generated for matching the second user-provided image with the user-interface input information indicating the second path (a straight direction indicator). When a stand displaying vegetables appears on moving along the third path, the path name of the third path may be set to "vegetables", and movement control information may be generated for matching the third user-provided image with the user-interface input information indicating the third path (a right direction indicator).
- the set path name information may be output on an input button of the user interface.
- For example, a user who wants to move directly to a fruit stand may press the input button (fruit) of the user interface labeled "fruit" on the screen of the user device.
- Input information corresponding to the input button (fruit) may be input, and a user-provided image matching the input information corresponding to the input button (fruit) may be output on the user device.
- FIG. 5 is a conceptual diagram illustrating an operation of a commodity transaction service server according to an embodiment of the present invention.
- Referring to FIG. 5, a method of generating movement control information based on a user-provided image is disclosed.
- a method for generating movement control information for moving directly to a product or product category desired by a user is disclosed.
- the user provided image matching the user input information may be output.
- The user input information may be a product the user wants to purchase or category information about the product.
- the merchandise transaction service server may generate, as movement control information, information about which merchandise is sold at which position on which path through object analysis of the user-provided image.
- For example, the commodity transaction service server may determine that strawberries are located at a third point on the first path, and manage this as movement control information.
- the product transaction service server may generate, as movement control information, information on which category of products are sold at which position on which path through object analysis of the user-provided image. For example, when a product such as milk, cheese, yoghurt, etc. is located at a specific location, the product transaction service server may determine that products corresponding to a dairy category are located and sold at the corresponding location. The commodity transaction service server may determine which path and which point the dairy product category, the fruit category, etc. are located and manage as movement control information.
- when a product or a product category (for example, dairy product 520) is input, the user-provided image (dairy product) 540 of the position corresponding to the dairy product 520 may be immediately output based on the movement control information.
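- The movement control information described above can be illustrated as a simple lookup from a product category to a (path, point) position and the user-provided image to output. The following sketch is hypothetical; the data structure, names, and values are assumptions, not the patent's actual implementation.

```python
# Illustrative movement control information: category -> position and image.
# All keys and values are hypothetical examples.
MOVEMENT_CONTROL_INFO = {
    "fruit": {"path": 1, "point": 3, "image": "user_image_fruit_stand"},
    "dairy": {"path": 2, "point": 1, "image": "user_image_dairy_stand"},
}

def resolve_jump(user_input: str):
    """Return the stored (path, point, image) for a product category,
    so the user device can jump directly to that stall."""
    entry = MOVEMENT_CONTROL_INFO.get(user_input)
    if entry is None:
        return None  # unknown category: keep showing the current image
    return entry["path"], entry["point"], entry["image"]
```

When the user inputs "dairy", the server would look up the stored position and immediately output the matching user-provided image.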
- FIG. 6 is a conceptual diagram illustrating an operation of a commodity transaction service server according to an embodiment of the present invention.
- the transaction control information may be control information for providing a user provided image corresponding to user input information input through a user interface (transaction).
- the product transaction service server may generate transaction control information for providing product information, selecting a product, and trading a product.
- the user-provided image may be provided through the user device according to input information input through the user interface (transaction) 600 based on the transaction control information.
- the user interface (transaction) 600 may be provided when the distance between the location of the product and the virtual location determined by the virtual movement on the virtual store space of the user is less than or equal to the threshold distance.
- the image of the product may be extracted as object information on the user-provided image. For example, it may be assumed that the user-provided image includes oranges, watermelons, strawberries, and the like sold as a product.
- the commodity transaction service server may extract object image information (eg, image information of orange, watermelon, strawberry, etc.) of each individual object in the user-provided image.
- the extracted object image information may be matched with product information through image analysis.
- the product transaction service server may extract object image information about the orange and determine which product corresponds to the object image information of the orange.
- when the extracted object image information matches the reference image for a specific product (for example, orange), the commodity transaction service server may match the product information (price, origin, sale status, etc.) for the orange with the object image information of the orange.
- Product information matched with object image information of orange may be expressed as image information and output on the user-provided image.
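- The reference-image matching step above can be sketched as follows. The similarity function is a stand-in for real image analysis, and the product database, names, and threshold are all hypothetical assumptions.

```python
# Hypothetical product database: name -> product information.
PRODUCT_DB = {
    "orange": {"price": 1.2, "origin": "California", "on_sale": True},
    "strawberry": {"price": 3.5, "origin": "Korea", "on_sale": False},
}

def similarity(object_image, reference_name) -> float:
    # Placeholder: a real system would compare extracted image features
    # against a stored reference image for the product.
    return 1.0 if object_image == reference_name else 0.0

def match_product(object_image, threshold=0.8):
    """Match extracted object image information with product information
    when the similarity to a reference image exceeds a threshold."""
    best_name, best_score = None, 0.0
    for name in PRODUCT_DB:
        score = similarity(object_image, name)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return best_name, PRODUCT_DB[best_name]
    return None, None
```

The matched product information would then be expressed as image information and output on the user-provided image.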
- the commodity transaction service server may extract, as object information of each individual object in the user-provided image, object-associated information (e.g., object description text (or a card, sign, or description board) located near the object, an object-related QR (quick response) code, etc.).
- the object association information extracted by the product transaction service server may be matched with the product information through image analysis.
- the commodity transaction service server may extract, as object-related information, image information about text located around the orange, such as 'Orange, CW, California, per piece', or an orange-associated QR code.
- the object association information may be matched with the object (or object image) nearest to the extraction position of the object association information.
- the product transaction service server may determine a product corresponding to the object association information.
- the merchandise trading service server may match the product information (price, origin, sale status, etc.) for a specific product (e.g., orange) corresponding to the object association information with the object association information (and/or the object image information).
- product information matched with object association information (or object image information) of a specific product may be expressed as image information and output on the user-provided image.
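- The rule that object association information is matched with the nearest object can be sketched as a nearest-neighbor lookup over image positions. The coordinate representation and names below are assumptions for illustration only.

```python
import math

def nearest_object(assoc_position, object_positions):
    """Return the key of the object whose image position is closest to
    the extraction position of the object-associated information
    (e.g., a price card or QR code detected in the user-provided image)."""
    return min(
        object_positions,
        key=lambda k: math.dist(assoc_position, object_positions[k]),
    )
```

For example, a price card extracted at (12, 6) would be associated with the orange at (10, 5) rather than the watermelon at (40, 5).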
- the extracted object image information and the product information may be matched by considering the location information.
- the commodity transaction service server may receive the store image capturing location information and the in-store map information from the image processing apparatus and determine what the product in the user-provided image is.
- the in-store map information may include a movement route in the store and a location of the product in the store (or information on a product for each location in the store).
- it may be assumed that a user-provided image is captured at a first place, and that the user-provided image includes an object image of a first product in a first direction and an object image of a second product in a second direction.
- Based on the in-store map information, it can be determined what the first product located in the first direction of the first place is and what the second product located in the second direction of the first place is. That is, the product information to be matched with the extracted object image information may be determined using only the image capturing position information and the image capturing direction information, without any image analysis.
- products (e.g., orange, strawberry, watermelon, etc.) located at the first place may be preferentially determined, and it may then be determined whether there is a product (e.g., strawberry) corresponding to the object image information among the located products.
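- The position-and-direction lookup described above can be sketched as a map keyed by (place, direction); the layout and names below are hypothetical.

```python
# Hypothetical in-store map: (capture place, capture direction) -> product.
STORE_MAP = {
    ("place1", "north"): "orange",
    ("place1", "south"): "strawberry",
    ("place2", "north"): "watermelon",
}

def product_at(place, direction):
    """Determine the product facing the camera using only the image
    capturing position and direction, without image analysis."""
    return STORE_MAP.get((place, direction))
```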
- the product transaction service server may process the object information of each individual object in the user-provided image and match it with product information.
- the image processing apparatus may additionally collect location information (product location information) of goods located in a store and identification information (product identification information) of goods while collecting a store image.
- the location information of the goods located in the store and the identification information about the goods may be transmitted to the image processing apparatus by a separate communication module located in the shop or around the goods.
- the product location information may include information about a specific location of the product in the store (for example, region A, first display stand, 3rd level, 3rd column), and the product identification information may be information identifying the product (for example, an identification code for the orange: 1010101).
- the product location information and the product identification information may be one grouped information.
- the product identification information may also include location information about the product.
- commodity position information and commodity identification information are disclosed as separate information for convenience of description.
- the merchandise transaction service server may identify the merchandise in the user-provided image and match the merchandise information by additionally considering the merchandise location information and merchandise identification information transmitted by the communication module, together with the photographed store image information and the store map information.
- the merchandise transaction service server may determine at which location (e.g., region A) the store image was photographed, based on the store map information.
- the merchandise trading service server may identify the object image included in the store image photographed at the in-store imaging position (for example, region A), the merchandise location information (for example, region A, first display stand, 3rd level, 3rd column), and the merchandise identification information (e.g., identification code for orange: 1010101).
- the merchandise transaction service server may perform an image analysis on the store image to determine the position of an object in the store image (for example, 3rd column, 3rd level of the first display stand).
- the commodity transaction service server may recognize the object A located in the 3rd column on the 3rd level of the first display stand as an orange by matching the determined position of the object against the (product location information, product identification information) pairs. Thereafter, the product transaction service server may provide the user with product information about the recognized object in the store.
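- The communication-module-based recognition above can be sketched by matching an object's in-store position against reported (location, identification code) pairs. The codes, position tuples, and names below are hypothetical.

```python
# Hypothetical reports from in-store communication modules:
# (product location information, product identification code).
REPORTED = [
    (("regionA", "stand1", "level3", "col3"), "1010101"),  # orange
    (("regionA", "stand1", "level2", "col1"), "2020202"),  # strawberry
]
ID_TO_NAME = {"1010101": "orange", "2020202": "strawberry"}

def recognize(object_location):
    """Recognize an object by matching its position (determined from
    image analysis of the store image) against reported pairs."""
    for location, code in REPORTED:
        if location == object_location:
            return ID_TO_NAME.get(code)
    return None
```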
- the methods disclosed in FIG. 6 may be used alone or in combination to provide product information in a user-provided image.
- FIG. 7 is a conceptual diagram illustrating an operation of a commodity transaction service server according to an embodiment of the present invention.
- In FIG. 7, a product purchase procedure of a user based on movement control information and transaction control information is disclosed.
- a procedure is disclosed for a user to perform a virtual movement in a store through a user interface (movement) and to select and trade a product through the user interface (transaction).
- the user interface (movement) and the user interface (transaction) are assumed to be user interfaces that recognize the operation of the user's hand as input information. However, various types of input information other than the operation of the user's hand may be used as the user input information.
- a user may receive a user provided image through a user device.
- the user device may be a head mounted display (HMD).
- the user device may recognize a user's hand gesture and convert the user's hand gesture into input information.
- the user may virtually move in the store through the user interface (move) 700.
- a user may virtually shop for a desired product through the user interface (movement) 700 as if the user enters the store through the actual store entrance and purchases the product.
- a movement indicator eg, an arrow displayed on the user interface (movement) 700 may be selected by a user's hand, and a user-provided image according to the selected movement indicator may be provided through the user device.
- the user may move directly to the product desired to purchase through the user interface (movement) 700.
- the user may enter 'fruit' on the user interface (move) 700 or select 'fruit' through the user interface (move) 700.
- a user-provided image corresponding to the fruit sales stall may be provided to the user device.
- the user may select a product displayed on the user-provided image by moving his or her hand.
- the user interface (transaction) 750 may recognize the movement of the user's hand as input information and output product information matched with a product image. For example, when the position of the hand performing a picking-up gesture matches the position of a specific product image, the user interface (transaction) 750 may recognize the movement of the user's hand as input information and output the product information matched with that product image.
- the user interface (transaction) 750 may receive user input information regarding whether to purchase a product from the user.
- information about the quantity to purchase may be input through the user interface (transaction) 750, and the purchased product may be stored in the user's virtual shopping cart.
- the user interface (transaction) 750 may perform a payment for the product collected in the shopping cart.
- the user inputs payment information (card number, etc.) through the user interface (transaction) 750, and the product service server may perform a transaction on the product based on the payment information.
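- The selection, virtual cart, and payment steps described above can be sketched as follows. The class, function names, and prices are hypothetical, and the payment step is a placeholder rather than a real transaction.

```python
class VirtualCart:
    """A user's virtual shopping cart: product -> quantity."""

    def __init__(self):
        self.items = {}

    def add(self, product, quantity=1):
        # A selected product and its quantity are stored in the cart.
        self.items[product] = self.items.get(product, 0) + quantity

    def total(self, prices):
        return sum(prices[p] * q for p, q in self.items.items())

def pay(cart, prices, payment_info):
    # A real server would validate payment_info (card number, etc.)
    # and perform the transaction; here we only return the amount due.
    return cart.total(prices)
```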
- FIG. 8 is a conceptual diagram illustrating an operation of a commodity trading service server according to an embodiment of the present invention.
- In FIG. 8, a method for adaptively arranging the positions of goods in a virtual store space is disclosed.
- the arrangement position of the goods may be adaptively adjusted according to the user characteristic information (information on the purchase tendency of the user / information on the purchase record of the user) 800.
- the user characteristic information 800 may include the purchase record information of the user, and the purchase record information of the user may include information about products previously purchased by the user.
- the purchase record information includes fruits and dairy products.
- the product locations and the movement path in the virtual store may be adjusted so that the fruit stand and the dairy stand are placed near the store entrance and are reached first by the user.
- products may be arranged in the virtual space in consideration of the user's interest in them.
- the user's interest in a product may be determined in consideration of the user's previous purchase record, the user's gender, the user's age, and the like. In this way, the items of interest to the user may be shown first, after the user's entry, on the user-provided image provided to the user.
- a user interface may be separately provided for directly moving to a location of goods expected to be of interest to the user.
- a product image may be provided via a user interface (movement) so that the user can go directly to the location of a product previously purchased.
- a user-provided image corresponding to the product image may be directly provided. The user may select a product from the provided user-provided image and proceed with a purchase process.
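- The adaptive arrangement described above can be sketched by ordering sales stands by how often their categories appear in the user's purchase record. The record and stand names below are hypothetical.

```python
from collections import Counter

def arrange_stands(purchase_records, all_stands):
    """Order sales stands so that the categories the user bought most
    often come first after the virtual store entrance. Stands the user
    never bought from keep their original relative order (stable sort)."""
    counts = Counter(purchase_records)
    return sorted(all_stands, key=lambda s: -counts[s])
```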
- an image processing apparatus used to generate a store image is exemplarily disclosed.
- the image processing apparatus disclosed below is one example, and various other types of image processing apparatuses may be utilized to generate a store image.
- FIG. 9 is a conceptual diagram illustrating an omnidirectional image processing apparatus according to an embodiment of the present invention.
- FIG. 9 illustrates a structure of an omnidirectional image processing apparatus.
- the omnidirectional image processing apparatus 900 may have a shape similar to a necklace that can be worn around a user's neck in a wearable structure.
- the omnidirectional image processing apparatus 900 may be in the shape of a necklace that is open on one side as shown in FIG. 9, or may be in the shape of a necklace that is not open on one side.
- the omnidirectional image processing apparatus 900 has a U shape with one surface open.
- the U-shaped omnidirectional image processing apparatus 900 may take an omnidirectional image by being worn in the form of a wearable device on a user's neck.
- the omnidirectional image processing apparatus 900 may be hung around the user's neck in the form of a necklace (or an open necklace on one side, or a U-shape).
- the omnidirectional image processing apparatus 900 may not be simply hanging on the user's neck.
- the omnidirectional image processing apparatus 900 may be installed on other body parts of the user or an external object (or object) / device / structure, etc. in various forms that can be hooked / attached to acquire an omnidirectional image.
- the user may acquire a plurality of images for generating the omnidirectional image while both hands are free with the omnidirectional image processing apparatus 900 implemented as a wearable device.
- the omnidirectional image processing apparatus 900 may include a plurality of image capturing units. Each of the plurality of image capturing units may be positioned at a specific interval (or a predetermined interval) in the omnidirectional image processing apparatus to individually capture an image according to an angle of view / image line. Positions of the plurality of image capturing units may be fixed in the omnidirectional image processing apparatus 900, but each of the plurality of image capturing units may be movable, and positions of the plurality of image capturing units may be changed.
- the omnidirectional image processing apparatus 900 may include three image capturing units, and the three image capturing units may capture an omnidirectional image at a constant field of view (e.g., 120 degrees to 180 degrees).
- the three image capturing units may be an image capturing unit 1 910, an image capturing unit 2 920, and an image capturing unit 3 930.
- a structure in which three image capturing units are included in the omnidirectional image processing apparatus 900 is disclosed, but a plurality of image pickup units other than three (for example, 2, 4, 5, 6, etc.) may instead be included in the omnidirectional image processing apparatus 900 to capture an omnidirectional image, and such structures are also included in the scope of the invention.
- the image capturing unit 1 910, the image capturing unit 2 920, and the image capturing unit 3 930 may capture an image according to an angle of view.
- Image 1 may be generated by the image capturing unit 1 910
- image 2 may be generated by the image capturing unit 2 920
- image 3 may be generated by the image capturing unit 3 930 on the same time resource.
- An angle of view of each of the image capturing unit 1 910, the image capturing unit 2 920, and the image capturing unit 3 930 may be 120 degrees or more, and the images 1, 2, and 3 may have overlapping imaging regions.
- the omnidirectional image may be generated by stitching / correcting the images 1, 2, and 3 captured on the same time resource by the omnidirectional image processing apparatus 900.
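- As an illustration of why overlapping imaging regions arise, the following hypothetical sketch computes the angular overlap between two capture units from their center directions and angles of view. This is a simplified planar model for illustration, not the apparatus's actual stitching algorithm.

```python
def overlap_deg(center1, fov1, center2, fov2):
    """Angular overlap (degrees) of two horizontal fields of view around
    a circle, given each unit's center direction and angle of view."""
    diff = abs(center1 - center2) % 360
    diff = min(diff, 360 - diff)  # shortest angular separation
    return max(0.0, fov1 / 2 + fov2 / 2 - diff)
```

Three units spaced 120 degrees apart with 150-degree angles of view would each overlap their neighbors by 30 degrees, which is the region used for stitching; at exactly 120-degree angles of view the overlap vanishes.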
- the stitching and / or correction procedure for the plurality of images may be performed by the omnidirectional image processing apparatus itself, or may be performed based on a user device (smart phone) that can communicate with the omnidirectional image processing apparatus 900. That is, additional image processing procedures for the generated plurality of images may be performed by the omnidirectional image processing apparatus 900 and / or another image processing apparatus (smartphone, personal computer (PC), etc.).
- FIG. 10 is a conceptual diagram illustrating characteristics of a plurality of image capturing units located in an omnidirectional image processing apparatus according to an exemplary embodiment of the present invention.
- In FIG. 10, features of a plurality of image capturing units located in a U-shaped omnidirectional image processing apparatus are disclosed.
- the position of the image capturing unit disclosed in FIG. 10 is exemplary.
- Each of the plurality of image capturing units may be located at various positions on the omnidirectional image processing apparatus to capture a plurality of images for generating the omnidirectional image.
- In the upper portion of FIG. 10, a rear portion of the omnidirectional image processing apparatus is disclosed.
- the image capturing unit 1 1010 and the image capturing unit 2 1020 included in the omnidirectional image processing apparatus may be located at a curved portion where curvature exists in the omnidirectional image processing apparatus.
- the image capturing unit 1 1010 and the image capturing unit 2 1020 may be positioned in a curved area that is in contact with the back of the neck.
- the image capturing unit 1 1010 and the image capturing unit 2 1020 may be positioned at a predetermined distance from the maximum curvature point (for example, the middle portion of the U-shape) of the U-shaped omnidirectional image processing apparatus.
- the image capturing unit 1 1010 may capture an area including a rear left rectangular area based on the line of sight direction of the user.
- the image capturing unit 2 1020 may capture an area including a rear right rectangular area based on the line of sight of the user.
- the image capturing unit 1 1010 may have a first angle of view and may capture an image corresponding to the first angle of view.
- the image capturing unit 2 1020 may have a second field of view, and may capture an image corresponding to the second field of view.
- the first angle of view and the second angle of view may be 120 to 180 degrees.
- a first overlapping area 1015 overlapped by the first view angle and the second view angle may occur.
- an omnidirectional image may be generated based on stitching considering the overlapping area.
- the front portion of the omnidirectional image processing apparatus is disclosed.
- the image capturing unit 3 (1030) may be located at the front of the omnidirectional image processing apparatus.
- the image capturing unit 3 1030 may be located at the distal end (the end of the U-shape) of the omnidirectional image processing apparatus.
- the distal end portion of the U-shaped omnidirectional image processing apparatus may be located in the front direction of the user (the direction of the user's gaze).
- the omnidirectional image processing apparatus may include a first end portion and a second end portion, and the image capturing unit 3 1030 may be positioned at one of the first end portion and the second end portion.
- the image capturing unit 3 1030 may capture an image in the same direction as the direction of the user's line of sight, thereby capturing an area corresponding to the line of sight of the user.
- the image capturing unit 3 1030 may have a third field of view, and may capture an image corresponding to the third field of view.
- the third angle of view may be 120 degrees to 180 degrees.
- a second overlapping area 1025 may be generated by the first angle of view of the image capturing unit 1 1010 and the third angle of view of the image capturing unit 3 1030.
- a third overlapping area 1035 may occur due to the second angle of view of the image capturing unit 2 1020 and the third angle of view of the image capturing unit 3 1030.
- the image capturing unit 1 1010 and the image capturing unit 2 1020 may be located at a relatively higher position than the image capturing unit 3 1030 with respect to the ground.
- the image capturing unit 3 (1030) is located only at one end.
- In the conventional omnidirectional image processing apparatus, a plurality of image capturing units positioned at the same height are implemented at a constant angle, whereas in the omnidirectional image processing apparatus according to an embodiment of the present invention, the angles between the plurality of image capturing units are different and the heights of the image capturing units are different from each other. Accordingly, the sizes/shapes of the first overlapping region 1015, the second overlapping region 1025, and the third overlapping region 1035 may differ across the plurality of images generated by each of the plurality of image capturing units.
- An omnidirectional image may be generated based on image processing procedures (stitching/correction, etc.) for the images 1, 2, and 3 generated by each of the plurality of image capturing units.
- the size of the first view angle, the second view angle, and the third view angle may be set to be the same, but may be set differently, and such embodiments are also included in the scope of the present invention.
- FIG. 11 is a conceptual diagram illustrating image pickup lines of a plurality of image pickup units according to an exemplary embodiment of the present invention.
- imaging lines of the plurality of image capturing units provided in the omnidirectional image processing apparatus are disclosed. Assuming that the ground is parallel to the XZ plane formed by the X and Z axes, the imaging line may be defined as a line vertically penetrating the center of the lens of each of the plurality of image capturing units included in the omnidirectional image processing apparatus, in a space represented by the X, Y, and Z axes.
- the existing omnidirectional image processing apparatus may implement a plurality of image capturing units at a constant angle (for example, 120 degrees) at the same height.
- the plurality of imaging lines of the plurality of image capturing units included in the conventional omnidirectional image processing apparatus may be lines that are parallel to the ground (or the XZ plane) and have a certain angle (for example, 120 degrees) between them.
- the heights of the plurality of image capturing units (or the implemented positions of the plurality of image capturing units) and the angles between the plurality of image capturing units (or the angles formed between the imaging lines) may differ from each other at the time of imaging. Therefore, the characteristics of the imaging lines of the omnidirectional image processing apparatus according to the embodiment of the present invention are different from those of the imaging lines of the conventional omnidirectional image processing apparatus.
- the imaging lines of the plurality of image capturing units represented in FIG. 11 may be examples for showing the differences in characteristics (e.g., height and angle) between the imaging lines of each of the plurality of image capturing units due to the characteristics of the wearable device.
- the imaging lines represented in FIG. 11 may be the imaging lines when there is no movement by the user wearing the omnidirectional image processing apparatus or when the omnidirectional image processing apparatus is fixed in a specific state.
- FIG. 11 illustrates the imaging lines of the image capturing unit 1 1110 and the image capturing unit 2 1120.
- the image capturing unit 1 1110 and the image capturing unit 2 1120 may be implemented at positions relatively higher than that of the image capturing unit 3 1130.
- the standing direction of the user wearing the omnidirectional image processing apparatus is assumed to be the Y-axis direction
- when the omnidirectional image capturing apparatus is worn around the neck as a wearable device, the curved portion (the curve/center portion of the U-shape) in which the image capturing unit 1 1110 and the image capturing unit 2 1120 are located may be relatively raised, and the leg portion (the end portion of the U-shape) in which the image capturing unit 3 1130 is located may be relatively lowered.
- the imaging line 1 1115 of the image capturing unit 1 1110 may be parallel to the XZ plane, and may have a first angle with the X axis, a second angle with the Y axis, and a third angle with the Z axis at the coordinate a of the Y axis.
- the imaging line 2 1125 of the image capturing unit 2 1120 may be parallel to the XZ plane, and may have a fourth angle with the X axis, a fifth angle with the Y axis, and a sixth angle with the Z axis at the coordinate a of the Y axis.
- the imaging line 3 1135 of the image capturing unit 3 1130 may be parallel to the XZ plane, and may have a seventh angle with the X axis, an eighth angle with the Y axis, and a ninth angle with the Z axis at the coordinate b of the Y axis. b may be less than a.
- the imaging line 3 1135 of the image capturing unit 3 1130 may be implemented to face the front surface (eg, a direction perpendicular to the XY plane) parallel to the XZ plane and the same as the line of sight of the user.
- the imaging line 1 1115 and the imaging line 2 1125 may have the same height with respect to the Y axis, and the imaging line 3 1135 may be positioned at a position relatively lower than the imaging line 1 and the imaging line 2 with respect to the Y axis.
- the imaging line 1 1115, the imaging line 2 1125, and the imaging line 3 1135 disclosed in FIG. 11 are one example of imaging lines having different characteristics; various other imaging lines may be defined, and an omnidirectional image may be captured.
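- The axis angles described above can be illustrated as direction cosines of an imaging line. The following hypothetical sketch converts the three axis angles into a unit direction vector; the consistency check reflects that valid axis angles satisfy the direction-cosine identity. Names are illustrative, not the patent's notation.

```python
import math

def direction_from_axis_angles(ax_deg, ay_deg, az_deg):
    """Unit direction vector of an imaging line from the angles it makes
    with the X, Y, and Z axes. For a line parallel to the XZ plane the
    Y-axis angle is 90 degrees, so the Y component is 0."""
    v = (math.cos(math.radians(ax_deg)),
         math.cos(math.radians(ay_deg)),
         math.cos(math.radians(az_deg)))
    # Valid axis angles satisfy cos^2(ax) + cos^2(ay) + cos^2(az) = 1.
    assert abs(sum(c * c for c in v) - 1.0) < 1e-9
    return v
```

For example, an imaging line parallel to the XZ plane with a 30-degree X-axis angle and a 60-degree Z-axis angle has direction (cos 30°, 0, cos 60°).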
- FIG. 12 is a conceptual diagram illustrating image pickup lines of a plurality of image pickup units according to an exemplary embodiment of the present invention.
- In FIG. 12, imaging lines of a plurality of image capturing units different from those of FIG. 11 are disclosed. Likewise, in FIG. 12, it is assumed that the ground is parallel to the XZ plane formed by the X and Z axes.
- FIG. 12 shows the imaging lines of the image capturing unit 1 1210 and the image capturing unit 2 1220.
- the image capturing unit 1 1210 and the image capturing unit 2 1220 may be implemented at positions relatively higher than the image capturing unit 3 1230.
- when the omnidirectional image capturing apparatus is worn around the neck, the curved portion (the curved portion of the U-shape) in which the image capturing unit 1 1210 and the image capturing unit 2 1220 are located may be relatively raised, and the leg portion (the distal portion of the U-shape) in which the image capturing unit 3 1230 is located may be relatively lowered.
- the imaging line 1 1215 of the image capturing unit 1 1210 may be parallel to the XZ plane, and may have a first angle with the X axis, a second angle with the Y axis, and a third angle with the Z axis at the coordinate a of the Y axis.
- the imaging line 2 1225 of the image capturing unit 2 1220 may be parallel to the XZ plane, and may have a fourth angle with the X axis, a fifth angle with the Y axis, and a sixth angle with the Z axis at the coordinate a of the Y axis.
- the imaging line 3 1235 of the image capturing unit 3 1230 may not be parallel to the XZ plane, and may have a seventh angle with the X axis, an eighth angle with the Y axis, and a ninth angle with the Z axis, with the coordinate b of the Y axis as a starting point.
- the imaging line may not be parallel to the XZ plane, but may have a constant angle (for example, 0 to 30 degrees) with the XZ plane.
- the imaging line 1 1215 and the imaging line 2 1225 may have the same height with respect to the Y axis, and the imaging line 3 1235 may be located at a position relatively lower than the imaging line 1 1215 and the imaging line 2 1225 with respect to the Y axis.
- while the imaging line 1 1215 and the imaging line 2 1225 are parallel to the XZ plane, the imaging line 3 1235 may not be parallel to the XZ plane.
- the imaging line 1 of the image capturing unit 1 may form a first' angle with the XZ plane, and may have a first angle with the X axis, a second angle with the Y axis, and a third angle with the Z axis, with the coordinate a of the Y axis as a starting point.
- the imaging line 2 of the image capturing unit 2 may form the first' angle with the XZ plane, and may have a fourth angle with the X axis, a fifth angle with the Y axis, and a sixth angle with the Z axis, with the coordinate a of the Y axis as a starting point.
- the imaging line 3 of the image capturing unit 3 may form a second' angle with the XZ plane, and may have a seventh angle with the X axis, an eighth angle with the Y axis, and a ninth angle with the Z axis, with the coordinate b of the Y axis as a starting point.
- the imaging line 1 of the image capturing unit 1 forms the first 'angle with the XZ plane and the coordinate a of the Y axis is the starting point. And a second angle, a Z axis, and a third angle.
- the imaging line 2 of the image capturing unit 2 forms the second angle with the XZ plane and sets the X axis and the fourth angle, the Y axis and the fifth angle, and the Z axis and the sixth angle with the coordinate a of the Y axis as a starting point.
- the imaging line 3 of the image capturing unit 3 may form a third' angle with the XZ plane and, with the coordinate b of the Y axis as a starting point, may form a seventh angle with the X axis, an eighth angle with the Y axis, and a ninth angle with the Z axis.
- the omnidirectional image processing apparatus may use a plurality of image capturing units whose imaging lines are located at different points on the Y axis and form different angles with the ground (or the XZ plane).
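The relationships above can be illustrated numerically: an imaging line starting at a point on the Y axis is characterized by the angles it forms with the X, Y, and Z axes (its direction cosines), and it is parallel to the XZ plane exactly when its angle with the Y axis is 90 degrees. A minimal sketch follows; the function names and the sample angles are illustrative assumptions, not values from the patent:

```python
import math

def imaging_line_direction(angle_x_deg, angle_y_deg, angle_z_deg):
    """Unit direction vector of an imaging line, built from the angles
    it forms with the X, Y, and Z axes (its direction cosines)."""
    d = (math.cos(math.radians(angle_x_deg)),
         math.cos(math.radians(angle_y_deg)),
         math.cos(math.radians(angle_z_deg)))
    norm = math.sqrt(sum(c * c for c in d))
    return tuple(c / norm for c in d)

def is_parallel_to_xz_plane(direction, tol=1e-9):
    """A line is parallel to the XZ plane iff its Y component is zero."""
    return abs(direction[1]) < tol

# Imaging lines 1 and 2: 90 degrees with the Y axis -> parallel to the XZ plane.
line1 = imaging_line_direction(30.0, 90.0, 60.0)
# Imaging line 3: tilted below the XZ plane (angle with the Y axis != 90 degrees).
line3 = imaging_line_direction(30.0, 100.0, 61.5)

print(is_parallel_to_xz_plane(line1))  # True
print(is_parallel_to_xz_plane(line3))  # False
```

The check mirrors the description: imaging lines 1 and 2 lie at the same Y-axis height and stay parallel to the XZ plane, while imaging line 3 starts at a different Y coordinate and is inclined toward the ground.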
- Embodiments of the present invention described above can be implemented in the form of program instructions executable by various computer components and recorded on a computer-readable recording medium.
- the computer-readable recording medium may include program instructions, data files, data structures, and the like, alone or in combination.
- Program instructions recorded on the computer-readable recording medium may be specially designed and configured for the present invention, or may be known to and usable by those skilled in the computer software art.
- Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
- the hardware device may be configured to operate as one or more software modules to perform the processing according to the present invention, and vice versa.
Landscapes
- Business, Economics & Management (AREA)
- Engineering & Computer Science (AREA)
- Accounting & Taxation (AREA)
- Finance (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Marketing (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- Development Economics (AREA)
- General Business, Economics & Management (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Human Computer Interaction (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Processing Or Creating Images (AREA)
Claims (11)
- An image-based transaction method, comprising: receiving, by a product transaction service server, store image information from an image processing apparatus; generating, by the product transaction service server, user-provided information based on the store image information; and transmitting, by the product transaction service server, the user-provided information to a user device, wherein the user-provided information includes user-provided image information and control information.
- The method of claim 1, wherein the user-provided image information is generated through post-processing of the store image information, the control information includes movement control information and transaction control information, the movement control information includes control information for a virtual movement of a user in a virtual store space output based on the user-provided image information, and the transaction control information includes control information for a transaction for a product located in the virtual store space.
- The method of claim 2, wherein the movement control information causes a user interface (movement) to be output on the user device and user-provided image information corresponding to user input information entered through the user interface (movement) to be output, and the transaction control information causes a user interface (transaction) to be output on the user device and product information corresponding to user input information entered through the user interface (transaction) to be provided.
- The method of claim 3, wherein the user interface (movement) is output at a preset branch point, and the user interface (transaction) is provided when a distance between a position of the product and a virtual position determined by the virtual movement is less than or equal to a threshold distance.
- The method of claim 4, wherein the store image information is an omnidirectional image of a store captured by the image processing apparatus, and the user device is capable of outputting the omnidirectional image and recognizes a movement of the user's hand as the user input information.
- A product transaction service server for image-based transactions, comprising: a communication unit for data communication with an image processing apparatus and a user device; and a processor operatively connected to the communication unit, wherein the processor is implemented to receive store image information from the image processing apparatus, generate user-provided information based on the store image information, and transmit the user-provided information to the user device, and wherein the user-provided information includes user-provided image information and control information.
- The product transaction service server of claim 6, wherein the user-provided image information is generated through post-processing of the store image information, the control information includes movement control information and transaction control information, the movement control information includes control information for a virtual movement of a user in a virtual store space output based on the user-provided image information, and the transaction control information includes control information for a transaction for a product located in the virtual store space.
- The product transaction service server of claim 7, wherein the movement control information causes a user interface (movement) to be output on the user device and user-provided image information corresponding to user input information entered through the user interface (movement) to be output, and the transaction control information causes a user interface (transaction) to be output on the user device and product information corresponding to user input information entered through the user interface (transaction) to be provided.
- The product transaction service server of claim 8, wherein the user interface (movement) is output at a preset branch point, and the user interface (transaction) is provided when a distance between a position of the product and a virtual position determined by the virtual movement is less than or equal to a threshold distance.
- The product transaction service server of claim 9, wherein the store image information is an omnidirectional image of a store captured by the image processing apparatus, and the user device is capable of outputting the omnidirectional image and recognizes a movement of the user's hand as the user input information.
- A computer-readable recording medium on which a computer program for executing the method of any one of claims 1 to 5 is recorded.
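Claims 4 and 9 make the transaction interface conditional on proximity: the user interface (transaction) is provided only when the distance between the product's position and the user's virtual position is at or below a threshold distance. A minimal sketch of that check follows; the function names, coordinate layout, and threshold value are illustrative assumptions, not from the patent:

```python
import math

THRESHOLD_DISTANCE = 1.5  # illustrative threshold, in virtual-store-space units

def distance(p, q):
    """Euclidean distance between two points in the virtual store space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def should_show_transaction_ui(product_pos, virtual_pos,
                               threshold=THRESHOLD_DISTANCE):
    """Per claims 4 and 9: provide the user interface (transaction) when the
    distance between the product's position and the virtual position
    determined by the user's virtual movement is at or below the threshold."""
    return distance(product_pos, virtual_pos) <= threshold

print(should_show_transaction_ui((0.0, 0.0, 2.0), (0.0, 0.0, 1.0)))  # True (distance 1.0)
print(should_show_transaction_ui((5.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # False (distance 5.0)
```

In a full implementation, the virtual position would be updated as the user selects branch points through the user interface (movement), and this check would run against each product in the current view.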
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019551592A JP2020511725A (ja) | 2017-03-31 | 2017-06-19 | 画像基盤の取引方法およびこのような方法を遂行する装置 |
US16/497,101 US20210118229A1 (en) | 2017-03-31 | 2017-06-19 | Image-based transaction method and device for performing method |
EP17903250.3A EP3605428A4 (en) | 2017-03-31 | 2017-06-19 | IMAGE-BASED TRANSACTION METHOD AND DEVICE FOR IMPLEMENTING THE METHOD |
CN201780088845.2A CN110462666A (zh) | 2017-03-31 | 2017-06-19 | 基于图像的交易方法及执行这种方法的装置 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20170041324 | 2017-03-31 | ||
KR10-2017-0041324 | 2017-03-31 | ||
KR10-2017-0045614 | 2017-04-08 | ||
KR1020170045614A KR101843335B1 (ko) | 2017-03-31 | 2017-04-08 | 영상 기반의 거래 방법 및 이러한 방법을 수행하는 장치 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018182092A1 true WO2018182092A1 (ko) | 2018-10-04 |
Family
ID=61907154
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2017/006421 WO2018182092A1 (ko) | 2017-03-31 | 2017-06-19 | 영상 기반의 거래 방법 및 이러한 방법을 수행하는 장치 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20210118229A1 (ko) |
EP (1) | EP3605428A4 (ko) |
JP (1) | JP2020511725A (ko) |
KR (2) | KR101843335B1 (ko) |
CN (1) | CN110462666A (ko) |
WO (1) | WO2018182092A1 (ko) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102101382B1 (ko) * | 2018-08-09 | 2020-04-22 | 링크플로우 주식회사 | 영상 공유 방법 및 장치 |
WO2020032371A1 (ko) * | 2018-08-09 | 2020-02-13 | 링크플로우 주식회사 | 영상 공유 방법 및 장치 |
KR102336264B1 (ko) * | 2019-07-03 | 2021-12-09 | 인하대학교 산학협력단 | 매장 내 자동 결제 방법, 시스템 및 프로그램 |
KR102148379B1 (ko) * | 2019-07-24 | 2020-08-26 | 신용강 | 원격 의류매장 서비스 방법 |
KR102181648B1 (ko) * | 2020-08-20 | 2020-11-24 | 신용강 | 원격 의류매장 플랫폼 제공 방법 및 장치 |
JP7445708B2 (ja) | 2022-06-29 | 2024-03-07 | 株式会社Zozo | 情報処理装置、情報処理方法及び情報処理プログラム |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20020084148A (ko) * | 2000-03-10 | 2002-11-04 | 리츠에프엑스 리미티드 | 가상현실 쇼핑시스템용 유저 인터페이스 |
KR100367183B1 (ko) * | 2000-04-25 | 2003-01-09 | 이젠펌 주식회사 | 인터넷 쇼핑몰 구성 및 서비스 방법 |
KR20150022064A (ko) * | 2013-08-21 | 2015-03-04 | (주)인스페이스 | 미러월드 기반 인터랙티브 온라인 쇼핑몰 제품의 판매 지원 시스템 |
KR20160018436A (ko) * | 2014-08-08 | 2016-02-17 | 그렉 반 쿠렌 | 가상 현실 시스템 및 이를 이용한 오디션 게임 시스템 |
KR20170027135A (ko) * | 2015-09-01 | 2017-03-09 | 엘지전자 주식회사 | 이동단말기 및 그 제어방법 |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030133008A1 (en) * | 1999-02-02 | 2003-07-17 | Stanley W. Stephenson | Wearable panoramic imager |
JP2001256364A (ja) * | 2000-03-13 | 2001-09-21 | Kenichi Omae | 購買端末、記録媒体、購買方法、販売サーバ及び販売方法 |
JP2003030469A (ja) * | 2001-07-16 | 2003-01-31 | Ricoh Co Ltd | 仮想現実空間を利用したバーチャルデパートによる商品販売システム、商品販売システム、プログラム、及び記録媒体 |
KR20040011056A (ko) * | 2002-07-27 | 2004-02-05 | (주)샵에프엔 | 3차원 가상현실 쇼핑몰 운영 시스템 및 상품 배치방법 |
JP2007122248A (ja) * | 2005-10-26 | 2007-05-17 | D Net:Kk | 電子ショッピングシステム |
US8370207B2 (en) * | 2006-12-30 | 2013-02-05 | Red Dot Square Solutions Limited | Virtual reality system including smart objects |
KR20090041192A (ko) * | 2007-10-23 | 2009-04-28 | 에스케이 텔레콤주식회사 | 가상의 쇼핑 공간을 제공하는 방법, 시스템 및 서버 |
JP5635709B2 (ja) * | 2011-03-01 | 2014-12-03 | ザ プロクター アンド ギャンブルカンパニー | 物理的小売環境のバーチャル図解上での物理的小売環境に関するデータの表示 |
JP2012190094A (ja) * | 2011-03-09 | 2012-10-04 | Sony Corp | サーバ装置、情報処理方法及びプログラム |
US9836747B2 (en) * | 2011-06-21 | 2017-12-05 | Simon Borrero | System and method for shopping goods, virtualizing a personalized storefront |
US20140095349A1 (en) * | 2012-09-14 | 2014-04-03 | James L. Mabrey | System and Method for Facilitating Social E-Commerce |
JP2015170266A (ja) * | 2014-03-10 | 2015-09-28 | 株式会社ゼンリンデータコム | 施設内案内システム、施設内案内サーバー装置、施設内案内方法および施設内案内プログラム |
KR20170031722A (ko) * | 2014-07-07 | 2017-03-21 | 넥시스 주식회사 | 웨어러블 디바이스를 이용한 정보처리 방법 |
EP3007029B1 (en) * | 2014-10-07 | 2017-12-27 | LG Electronics Inc. | Mobile terminal and wearable device |
TWI540522B (zh) * | 2015-02-26 | 2016-07-01 | 宅妝股份有限公司 | 採用虛擬實境與擴增實境技術的虛擬購物系統與方法 |
KR20160128119A (ko) * | 2015-04-28 | 2016-11-07 | 엘지전자 주식회사 | 이동 단말기 및 이의 제어방법 |
KR101613287B1 (ko) * | 2015-06-15 | 2016-04-19 | 김영덕 | 3d 파노라마 영상을 근거로 한 여행지 원스톱 쇼핑 시스템 및 그의 제어 방법 |
KR101613278B1 (ko) * | 2015-08-18 | 2016-04-19 | 김영덕 | 증강 현실 기반 쇼핑 정보 제공 시스템 및 그의 제어 방법 |
KR101715828B1 (ko) * | 2016-08-24 | 2017-03-14 | 주식회사 팝스라인 | 단말 및 그의 제어 방법 |
-
2017
- 2017-04-08 KR KR1020170045614A patent/KR101843335B1/ko active IP Right Grant
- 2017-06-19 WO PCT/KR2017/006421 patent/WO2018182092A1/ko active Application Filing
- 2017-06-19 JP JP2019551592A patent/JP2020511725A/ja active Pending
- 2017-06-19 CN CN201780088845.2A patent/CN110462666A/zh not_active Withdrawn
- 2017-06-19 US US16/497,101 patent/US20210118229A1/en not_active Abandoned
- 2017-06-19 EP EP17903250.3A patent/EP3605428A4/en not_active Withdrawn
-
2018
- 2018-03-19 KR KR1020180031286A patent/KR102206133B1/ko active IP Right Grant
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20020084148A (ko) * | 2000-03-10 | 2002-11-04 | 리츠에프엑스 리미티드 | 가상현실 쇼핑시스템용 유저 인터페이스 |
KR100367183B1 (ko) * | 2000-04-25 | 2003-01-09 | 이젠펌 주식회사 | 인터넷 쇼핑몰 구성 및 서비스 방법 |
KR20150022064A (ko) * | 2013-08-21 | 2015-03-04 | (주)인스페이스 | 미러월드 기반 인터랙티브 온라인 쇼핑몰 제품의 판매 지원 시스템 |
KR20160018436A (ko) * | 2014-08-08 | 2016-02-17 | 그렉 반 쿠렌 | 가상 현실 시스템 및 이를 이용한 오디션 게임 시스템 |
KR20170027135A (ko) * | 2015-09-01 | 2017-03-09 | 엘지전자 주식회사 | 이동단말기 및 그 제어방법 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3605428A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP3605428A4 (en) | 2020-04-15 |
KR20180111536A (ko) | 2018-10-11 |
EP3605428A1 (en) | 2020-02-05 |
JP2020511725A (ja) | 2020-04-16 |
CN110462666A (zh) | 2019-11-15 |
KR101843335B1 (ko) | 2018-03-29 |
US20210118229A1 (en) | 2021-04-22 |
KR102206133B1 (ko) | 2021-01-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018182092A1 (ko) | 영상 기반의 거래 방법 및 이러한 방법을 수행하는 장치 | |
WO2017085771A1 (ja) | 精算支援システム、精算支援プログラム、及び精算支援方法 | |
US8571298B2 (en) | Method and apparatus for identifying and tallying objects | |
EP3618427B1 (en) | Video monitoring system | |
WO2015033577A1 (ja) | 顧客行動分析システム、顧客行動分析方法、非一時的なコンピュータ可読媒体及び棚システム | |
US20180181995A1 (en) | Systems and methods for dynamic digital signage based on measured customer behaviors through video analytics | |
WO2012093744A1 (ko) | Rfid 기술 및 영상 기술을 통합한 동선 분석 시스템 및 그 방법 | |
RU2609100C2 (ru) | Способ содействия обнаружению желаемого предмета в месте хранения | |
US20170068945A1 (en) | Pos terminal apparatus, pos system, commodity recognition method, and non-transitory computer readable medium storing program | |
WO2019038965A1 (ja) | 店舗装置、店舗管理方法、プログラム | |
WO2017115946A1 (en) | Automatic product mapping | |
WO2018225939A1 (ko) | 이미지 기반 광고 제공 방법, 장치 및 컴퓨터 프로그램 | |
WO2019038968A1 (ja) | 店舗装置、店舗システム、店舗管理方法、プログラム | |
WO2017038035A1 (ja) | 行動履歴情報生成装置、システム、及び方法 | |
JP2019139321A (ja) | 顧客行動分析システムおよび顧客行動分析方法 | |
WO2018182068A1 (ko) | 아이템에 대한 추천 정보 제공 방법 및 장치 | |
WO2018043859A1 (ko) | 사용자 이미지를 이용한 대여 아이템 선호도 자동 분석 장치 및 이를 이용한 방법 | |
US11978105B2 (en) | System, method, and apparatus for processing clothing item information for try-on | |
JP6886537B1 (ja) | 顧客情報収集端末、顧客情報収集システム及び顧客情報収集方法 | |
WO2020060012A1 (en) | A computer implemented platform for providing contents to an augmented reality device and method thereof | |
WO2018216844A1 (ko) | 촬상 위치 정보를 결정하는 방법 및 이러한 방법을 수행하는 장치 | |
WO2021025237A1 (ko) | 쇼핑 카트 및 그의 상품 인식 방법, 이를 이용한 쇼핑 서비스 제공 시스템 및 방법 | |
WO2024005530A1 (ko) | 실물경제 기반의 메타버스 시스템 및 작동 방법 | |
WO2020145547A1 (ko) | 전자 장치 및 그의 제어 방법 | |
WO2017078383A1 (ko) | 다중 사용자의 영상 콘텐츠 내 상품 좌표 추적 데이터에 대한 실시간 통합 데이터 매핑 장치 및 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17903250 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2019551592 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017903250 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017903250 Country of ref document: EP Effective date: 20191031 |