WO2017188696A1 - Method, device, and recording medium for providing a user interface in a VR space - Google Patents

Method, device, and recording medium for providing a user interface in a VR space

Info

Publication number
WO2017188696A1
Authority
WO
WIPO (PCT)
Prior art keywords
user, image, space, information, virtual
Application number
PCT/KR2017/004365
Other languages
English (en)
Korean (ko)
Inventor
장부다
이정
한명숙
Original Assignee
장부다
이정
한명숙
Application filed by 장부다, 이정, 한명숙
Publication of WO2017188696A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Definitions

  • the disclosed embodiments relate to a method for providing a user interface in a VR space, a device for providing a user interface in a VR space, and a recording medium having recorded thereon a program for performing the method.
  • Such virtual reality experience technologies may be realized through a device such as a head-mounted display (HMD), which lets a user experience virtual reality by distorting an image output through a lens to match the user's field of view.
  • Since virtual reality is a space in which users can indirectly experience things that are difficult to experience in the real world, it is necessary to provide an environment that can be optimized for each user.
  • Provided are a method, a device, and a recording medium for providing a user interface that implements various services required by the user more freely than in the physical space.
  • a method of providing a user interface (UI) in a VR space by a device displaying a live image and a virtual image regarding a specific location is disclosed.
  • FIG. 1 is a conceptual diagram illustrating a method of providing a user interface (UI) in a VR space by a device according to an exemplary embodiment.
  • FIG. 2 is a flowchart illustrating a method of providing a UI in a virtual space by a device according to an exemplary embodiment.
  • FIG. 3 is a diagram for describing a UI of a user provided by a device, according to an exemplary embodiment.
  • FIG. 4 is a diagram for describing a method of providing, by a device, sports training through a UI of a user, according to an exemplary embodiment.
  • FIG. 5 is a diagram for describing a method of providing, by a device, a medical service through a UI of a user, according to an exemplary embodiment.
  • FIG. 6 is a diagram for describing a method of providing, by a device, a service to a plurality of users through a UI of a user, according to an exemplary embodiment.
  • FIG. 7 is a diagram for describing a method of providing, by a device, history information of a user through a UI of the user in a VR space according to an embodiment.
  • FIG. 8 is a diagram for describing a method of setting, by a device, a UI of a user through other external servers according to an exemplary embodiment.
  • FIG. 9 is a diagram for describing a method of providing, by a device, a user interface (UI) in a VR space according to an embodiment.
  • FIG. 10 is a flowchart illustrating a method of providing a UI in a VR space by a device according to an exemplary embodiment.
  • FIG. 11 is a diagram for describing a method of obtaining, by a device, a user's input and displaying a search box in a VR space, according to an exemplary embodiment.
  • FIG. 12 is a diagram for describing in more detail a UI in a VR space provided by a device, according to an exemplary embodiment.
  • FIG. 13 is a diagram for describing a method of providing, by a device, an advertisement by dividing a VR space in a mesh form.
  • FIG. 14 is a diagram for describing a third virtual space in which a device provides an advertisement according to an exemplary embodiment.
  • FIG. 15 is a flowchart illustrating a method of displaying, by a device, an advertisement and a search result in a UI on a VR space.
  • FIG. 16 is a flowchart illustrating a method of providing content in a VR space by a device according to an exemplary embodiment.
  • FIG. 17 is a diagram for describing in more detail a method of providing, by a device, a live-action image and a virtual image of a place where a sporting event is performed.
  • FIG. 18 is a diagram for describing a method of providing, by a device, a combination of a live image and a virtual image of a first location in a VR space.
  • FIG. 19 is a diagram for describing a method of providing, by a device, a combination of a live image and a virtual image of a second position in a VR space, according to an exemplary embodiment.
  • FIG. 20 is a diagram for describing a method of providing, by a device, a service to a user at a third location in a VR space, according to an exemplary embodiment.
  • FIG. 21 is a diagram for describing a supporter avatar provided by a device, according to an exemplary embodiment.
  • FIG. 22 is a diagram for describing in detail a method of providing, by a device, content related to a sporting event according to an embodiment.
  • FIG. 23 is a diagram for describing a method of providing, by a device, history information of content related to a sports game viewed by a user.
  • FIG. 24 is a diagram for describing a method of providing, by a device, an artificial intelligence guide for a sporting event, according to an exemplary embodiment.
  • FIG. 25 is a diagram illustrating a payment system using a device that provides a virtual reality (VR) space, according to an embodiment.
  • FIG. 26 is a flowchart illustrating a payment method of a device for providing a VR space according to an embodiment.
  • FIG. 27 is a diagram for describing a method of paying, by a device, a right to use a service available through a user interface of a VR space.
  • FIG. 28 is a diagram for describing a method of allowing a device to pay for a product that can be purchased through a user interface of a VR space, according to an embodiment.
  • FIG. 29 is a flowchart for describing a method of performing, by a device, payment for a product or a service using an avatar in a VR space.
  • FIG. 30 is a diagram for describing a method of performing, by a device, payment for a product or a service by using an avatar in a VR space.
  • FIG. 31 is a flowchart illustrating a method of a service providing server paying at least one of a product and a service requested by a device.
  • FIGS. 32 and 33 are block diagrams of devices providing a UI in a VR space according to an embodiment.
  • FIG. 34 is a block diagram of a service providing server according to an exemplary embodiment.
  • a method of providing a user interface (UI) in a VR space includes: obtaining identification information from a user of the device; allocating a preset template UI for the user in the VR space based on the obtained identification information; receiving, from the user, profile information and information about an object that the user intends to place in the preset template UI; and generating a UI for the user by arranging, on the template UI, an avatar of the user generated based on the profile information and a virtual object generated based on the information about the object.
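As a rough, non-authoritative sketch of this flow, the following Python snippet shows one way a device might assemble a per-user UI from a preset template, an avatar derived from profile information, and a virtual object derived from the requested object information. All names (TemplateUI, Avatar, VirtualObject, build_user_ui) are hypothetical and not taken from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class Avatar:
    name: str
    body_type: str                      # derived from the profile's body information

@dataclass
class VirtualObject:
    label: str                          # e.g. "boat" or "car A"
    position: tuple = (0.0, 0.0, 0.0)   # where it sits on the template UI

@dataclass
class TemplateUI:
    owner_id: str
    items: list = field(default_factory=list)

    def place(self, item):
        self.items.append(item)

def build_user_ui(identification: str, profile: dict, object_info: str) -> TemplateUI:
    """Allocate a preset template UI for the identified user, then arrange the
    user's avatar and a virtual object generated from the object information."""
    template = TemplateUI(owner_id=identification)
    avatar = Avatar(name=profile["name"], body_type=profile.get("body", "average"))
    virtual_object = VirtualObject(label=object_info)
    template.place(avatar)
    template.place(virtual_object)
    return template

ui = build_user_ui("user-001", {"name": "Alice", "body": "slim"}, "boat")
print([type(item).__name__ for item in ui.items])   # ['Avatar', 'VirtualObject']
```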
  • a method of providing a user interface (UI) in a VR space includes displaying at least one category of objects that can be placed in the template UI, and displaying a plurality of objects included in a selected category as the user selects any one of the at least one category.
  • a method of providing a user interface (UI) in a VR space by a device may include displaying payment information necessary for purchasing an object as the information about the object is received, and receiving, from the user, an input regarding a payment means corresponding to the payment information.
  • a method of providing a user interface (UI) in a VR space may include displaying at least one advertisement on one side of the allocated UI.
  • a method of providing a user interface (UI) in a VR space includes obtaining biometric information of the user wearing the device at preset intervals, and generating health information of the user based on the obtained biometric information.
  • a method of providing a user interface (UI) in a VR space may include generating at least one of medical information and recommended sports information about the user based on the generated health information, and displaying the generated information on the assigned UI.
  • a method of providing a user interface (UI) in a VR space includes detecting a change in a state of the user while displaying the UI in the VR space, and changing the avatar of the user according to the detected change.
  • a device for providing a user interface (UI) in a VR space may include: a sensing unit configured to obtain identification information from a user of the device; a processor configured to allocate a preset template UI for the user in the VR space based on the obtained identification information, to receive from the user profile information and information about an object that the user wants to place in the preset template UI, and to generate a UI for the user by arranging, on the template UI, an avatar of the user generated based on the profile information and a virtual object generated based on the information about the object; and an output unit configured to display the generated UI for the user.
  • a method of providing a UI in a VR space by a device includes: displaying a search box on the VR space as a user input is obtained; retrieving information about an object as a search request for the object is received through the displayed search box; displaying at least one advertisement on the VR space while retrieving the information about the object; and, as the search is completed, stopping the display of the at least one advertisement and displaying the information about the found object on the VR space.
  • a method of providing a UI on a VR space by a device further includes selecting at least one advertisement from among a plurality of advertisements based on a profile of the user and history information about at least one object for which a search was previously requested.
  • the displaying includes moving an image constituting the at least one advertisement in a preset direction, and the moving speed of the image increases as the end time of the search for the object approaches.
  • a method of providing a UI on a VR space by a device may include, when a selection input for selecting at least one displayed advertisement is received from the user while information about an object is being searched for, receiving information about a product of the selected advertisement and displaying it on the VR space.
  • the displaying may include displaying the search result and the information about the product of the selected advertisement together in the VR space when the search for the object is completed.
  • a user input includes a user's gesture, and the device determines the size, the position, and the shape of the search box according to the user's gesture.
  • a device for providing a user interface (UI) in a VR space may include: a sensing unit configured to obtain a user input; an output unit configured to display a search box on the VR space as the user input is obtained; and a processor configured to search for information about an object as a search request for the object is received through the displayed search box, to display at least one advertisement in the VR space while the search is in progress, and to stop displaying the at least one advertisement as the search is completed, wherein the output unit displays the information about the retrieved object on the VR space.
  • a method of providing a UI in a VR space by a device includes: receiving a content request regarding a sporting event from a user; acquiring a live-action image and a virtual image of a place at which the sporting event proceeds as the content request is received; and displaying, according to the user's gesture, a live-action image and a virtual image of a specific location at the place where the sporting event is held.
  • a method of providing a UI in a VR space includes determining a location, corresponding to the user's gesture, within the place where the sporting event is held, and receiving a live-action image and a virtual image for the determined location.
  • the live-action image includes images photographed at each of a plurality of viewpoints of the sporting event, and the displaying includes displaying an image photographed at a first viewpoint at the specific location in combination with at least one image photographed at a viewpoint different from the first viewpoint.
  • a method of providing content in a VR space includes transmitting a purchase request and payment information about an object displayed on at least one of a live-action image and a virtual image of a specific location, and, when the purchase of the object is completed, displaying a virtual image of the object on the VR space.
  • a method of providing a UI in a VR space includes: generating history information of at least one piece of content watched by the user before content related to a sporting event is provided; when one of the at least one piece of content is selected, displaying history information of the selected content on the VR space; and, when the user selects a location included in the displayed history information, displaying a live-action image and a virtual image of the selected location on the VR space.
  • a method of providing a UI in a VR space includes: selecting, from among a plurality of other users connected through the user's social network service (SNS) account, another user who uses content related to the sporting event; transmitting an invitation message to the VR space to a device of the selected user; and displaying a virtual image representing the other user on the VR space as an acceptance message for the invitation message is received.
  • a method of providing a UI in a VR space by a device includes receiving profile information of the user, generating an avatar image representing the user based on the received profile information, and displaying the avatar image on the VR space in combination with the live-action image and the virtual image.
  • a device for providing a UI in a VR space includes: a sensing unit configured to receive a content request regarding a sporting event from a user; a processor for determining a place at which the sporting event proceeds as the content request is received; a communication unit for obtaining a live-action image and a virtual image of the determined place; and an output unit for displaying, according to a gesture of the user, a live-action image and a virtual image of a specific position at the place where the sporting event is conducted.
  • a payment method using a device that provides a UI in a VR space includes: receiving a user input selecting one of items related to at least one of goods and services displayed on the user interface of the VR space provided by the device; transmitting payment information of the item and identification information associated with a payment means as the payment means for paying for the selected item is selected; and displaying payment completion information on the user interface when payment for the selected item is completed.
  • the VR space is divided into a plurality of places according to the kinds of goods and services provided in the VR space, and the method further includes receiving, in the VR space, information about the cost required to access at least one of the plurality of places.
  • a payment method using a device that provides a UI in a VR space includes displaying payment information regarding the cost required to access a target place selected, based on the user's request, from among the plurality of places, and, when payment of the cost required to access the target place is completed, displaying the avatar generated based on the user's profile in the target place.
  • a payment method using a device that provides a UI in a VR space may further include inputting identification information of the user, corresponding to at least one payment means that can be provided by the device, into attribute information of an avatar created on the VR space based on the user's profile. In this case, the transmitting may include transmitting the attribute information of the avatar and the payment information of the item to a first server that provides at least one of goods and services, and the identification information of the user extracted from the avatar attribute information and the payment information of the item are transmitted from the first server to a payment server that performs payment for the item.
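A minimal sketch of this avatar-mediated payment path, assuming hypothetical function names (embed_payment_identity, first_server_handle, payment_server_charge) and a dictionary-based message format: the device embeds the user identification tied to a payment means in the avatar attributes, and the first (seller) server extracts it and forwards it, together with the item's payment information, to a payment server.

```python
def embed_payment_identity(avatar_attributes: dict, user_identification: str) -> dict:
    """Device side: attach the user identification tied to a payment means
    to the attribute information of the user's avatar."""
    enriched = dict(avatar_attributes)
    enriched["payment_identification"] = user_identification
    return enriched

def payment_server_charge(identification: str, payment_info: dict) -> dict:
    # Stand-in for the payment server that actually settles the item.
    return {"status": "completed", "user": identification, "amount": payment_info["amount"]}

def first_server_handle(avatar_attributes: dict, payment_info: dict) -> dict:
    """First (seller) server: extract the user's identification from the avatar
    attributes and forward it, with the item's payment information, to the payment server."""
    identification = avatar_attributes["payment_identification"]
    return payment_server_charge(identification, payment_info)

attrs = embed_payment_identity({"avatar_name": "Alice"}, "iris-hash-1234")
receipt = first_server_handle(attrs, {"item": "virtual boat", "amount": 25})
print(receipt["status"])   # completed
```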
  • the payment method using a device that provides a UI in a VR space may further include displaying a virtual image related to the selected item on a user interface of the VR space as the payment is completed.
  • a payment method using a device that provides a UI in a VR space includes receiving a user input selecting one of a plurality of payment means that can be provided by the device, and acquiring preset identification information for the selected payment means.
  • the identification information includes at least one of an iris signal, an EEG signal, and a pulse signal of the user.
  • a device for providing a UI in a VR space includes: a sensing unit configured to receive a user input for selecting one of items related to at least one of a product and a service displayed on a user interface of the VR space; a processor for selecting a payment means for paying for the selected item; a communication unit for transmitting, as the payment means is selected, the identification information associated with the payment means and the payment information of the item; and an output unit for displaying payment completion information on the user interface when payment for the selected item is completed.
  • when any part of the specification is said to “include” a component, this means that it may further include other components, rather than excluding other components, unless otherwise stated.
  • the terms “... unit”, “module”, etc. described in the specification mean a unit for processing at least one function or operation, which may be implemented in hardware, software, or a combination of hardware and software.
  • FIG. 1 is a conceptual diagram illustrating a method of providing a user interface (UI) in a VR space by a device 100 according to an exemplary embodiment.
  • the device 100 may implement a VR environment by outputting an image on a display through a pre-made lens in order to make the user feel as if the user is located in a new space different from the space in which the user is currently located.
  • the device 100 may provide a UI for allowing a user to directly create a space so that the user may own a space unique to the user in the VR space.
  • the device 100 may provide a preset template UI 100 to the user.
  • the preset template UI 100 may include icons 110, 120, 130, 140, and 150 necessary for the user to create his or her space or control the VR space.
  • Icons 110, 120, 130, 140, and 150 required to control the VR space may include a toolbar icon 110, a home icon 120, a favorite icon 130, a setting icon 140, and a preview icon 150.
  • the toolbar icon 110 may include icons required for placing, removing, and editing an object desired by the user on the template UI 100.
  • the home icon 120 may move a user at another place to the home in the VR space.
  • the home refers to a space that the user directly produces in the VR environment.
  • the user may move to the home by touching the home icon 120.
  • the favorite icon 130 may include location information, on the web, of places that the user mainly visits among the various places located on the VR space. For example, the user may store the address information of a first place and a second place that the user frequently visits in the VR space in the favorite icon 130, and may then select the favorite icon 130 from another place to move more easily to the stored place.
  • the setting icon 140 may include profile information of the user, profile information of other users connectable to the device 100 of the user, and community information on the VR space including the user.
  • the user may select the setting icon 140 to update previously stored user profile information, profile information of other users, community information on the VR space, and the like.
  • the preview icon 150 is a window in which a search result of other places, objects, etc. in the VR space searched by the user is displayed in a preview format.
  • the device 100 may identify and show the VR space where the user is located and the search result by displaying the search result searched by the user through the preview icon 150.
  • the device 100 may generate a UI for the user by arranging a virtual object selected by the user on the preset template 100.
  • the device 100 may generate not only the virtual object but also the user's avatar together on the template UI to generate a UI for the user.
  • the device 100 may display the created UI of the user first, and may provide an environment in which the user may enjoy various entertainment activities in his / her own home in the VR space.
  • By placing, on the user-created UI, a variety of virtual objects that the user does not have in the real world, the user can be made to feel vicarious satisfaction.
  • a detailed method of setting the UI of the user by arranging the virtual object on the preset template 100 by the user will be described later with reference to FIG. 3.
  • the device 100 may be implemented in various forms.
  • the device 100 described in the present specification may be a mobile phone, a smart phone, a laptop computer, a tablet PC, an electronic book terminal, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a smart TV, a smart car, a consumer electronics (CE) device (e.g., a refrigerator or an air conditioner having a display panel), a head mounted display (HMD), and the like, but is not limited thereto.
  • FIG. 2 is a flowchart illustrating a method of providing a UI in a virtual space by a device according to an exemplary embodiment.
  • In step S210, the device obtains identification information from the user of the device.
  • the device may display a message requesting identification information to the user before setting a home space which is a user's own space on the VR space.
  • the user may input identification information of the user using at least one of a touch, a gesture, and a voice.
  • the identification information of the user may include an ID and a password of the user, but this is only an example, and the identification information is not limited to the above-described example.
  • the device allocates a preset template UI for the user in the VR space based on the obtained identification information.
  • the device 100 may allocate a preset template UI to the user.
  • the device 100 may exchange information in real time with a home management server that sets and manages a home space for each of a plurality of users.
  • the device 100 may directly allocate a preset template UI or receive a template UI assigned by the home management server based on user identification information.
  • the device 100 receives profile information and information about an object that the user intends to place in a preset template UI from the user.
  • the profile information of the user may include information about a job, age, gender, body information, and a photograph of the user.
  • the object may include objects such as furniture, transportation means, electronic devices, sports equipment, and clothing, plants such as flowers and trees, and animals such as dogs, cats, and birds.
  • the device 100 may detect a voice, a gesture, a touch signal, or the like of a user, and receive information about an object to be placed by the user. For example, when the user speaks "A car", the device 100 may detect a voice signal "A car" and determine that the object that the user wants to place in the template UI is A car.
  • the device 100 may provide images of various objects included in a category when a user inputs a category of an object through a search window on a VR space. For example, when the user inputs the term semi-medium car, the device 100 may display an image of cars included in the semi-medium car. The user may select one of the images of the cars displayed through the device 100 through a touch input or a gesture.
  • In operation S240, the device 100 generates a UI for the user by arranging, on the template UI, the avatar of the user generated based on the profile information and the virtual object generated based on the information about the object.
  • the device 100 may generate an avatar of the user in the VR space based on the profile information. For example, the device 100 may generate an avatar having a similar body shape and appearance to the user based on the user's body information, age, and photo included in the profile information. According to another example, the device 100 may obtain information about the appearance desired by the user in addition to the profile information of the user and generate the avatar of the user based on the information.
  • the device 100 may arrange the generated avatar on the template UI.
  • the user may control the position, movement and posture of the avatar on the template UI through touch, gesture, and voice input.
  • the device 100 may obtain a virtual object corresponding to an object that the user wants to place as information about an object that the user wants to place on the template UI is obtained.
  • the virtual object represents a three-dimensional image of the object in the virtual space.
  • the device 100 may request a user to pay for the virtual object in order to obtain the virtual object.
  • the device 100 may request a virtual object from a server of a seller selling a virtual object corresponding to an object selected by a user.
  • the seller's server that receives the request may ask the device 100 for a payment means to pay the amount set for the virtual object, and the device 100 may pay the requested amount using a payment means predetermined by the user.
  • the device 100 may obtain a virtual object.
  • the device 100 may set a session with a server of a seller and a predetermined payment server to pay a cost for obtaining a virtual object.
  • the device 100 may store a template UI in which an avatar and a virtual object are disposed as a UI for a user.
  • Information about the UI of the user stored by the device 100 may be transmitted to the home management server.
  • the home management server may store the received information about the UI of the user by matching the identification information of the user. Accordingly, when the user later accesses the VR space and inputs identification information of the user, the user UI may be received from the home management server.
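As an illustrative sketch only (the class and method names HomeManagementServer, save_ui, and load_ui are assumptions, not the patent's terminology), a home management server could key each user's saved UI by the user's identification information and return it on a later visit:

```python
class HomeManagementServer:
    """Stores each user's saved UI keyed by identification information and
    returns it when the user later re-enters the VR space."""

    def __init__(self):
        self._saved_uis = {}

    def save_ui(self, identification: str, ui_description: dict) -> None:
        self._saved_uis[identification] = ui_description

    def load_ui(self, identification: str):
        # Returns None when no home space exists yet, in which case the device
        # would fall back to allocating a preset template UI.
        return self._saved_uis.get(identification)

server = HomeManagementServer()
server.save_ui("user-001", {"objects": ["boat", "dock"], "avatar": "Alice"})
print(server.load_ui("user-001"))
```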
  • FIG. 3 is a diagram for describing a UI 300 of a user provided by the device 100, according to an exemplary embodiment.
  • the device 100 may load and display the UI 300 of the user in a VR space previously set by the user.
  • the UI 300 of the user may be stored in the device 100 or may be stored in the home management server.
  • the UI 300 of the user may include icons 310, 320, 330, 340, and 350 required to control the VR space, which correspond to the toolbar icon 110, the home icon 120, the favorite icon 130, the setting icon 140, and the preview icon 150 described above with reference to FIG. 1.
  • the device 100 may display the advertisement 360 on the UI 300 of the user.
  • the advertisement 360 may be selected based on the profile information of the user.
  • the device 100 may not display the advertisement 360 on the UI 300 of the user according to the user's selection.
  • the user may obtain a point for viewing the advertisement by displaying the advertisement 360 on the UI 300 of the user.
  • the device 100 may obtain information regarding viewing of the advertisement 360 by recording the information about the type and the number of the displayed advertisement 360 and providing the recorded information to the point providing server.
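A toy example of this bookkeeping, under the assumption of a simple one-point-per-view policy and invented names (AdViewLog, record, report); the reported payload is the kind of summary the device might send to the point providing server.

```python
from collections import Counter

class AdViewLog:
    """Records which advertisements were displayed on the user's UI so the device
    can report the counts to a point providing server and credit viewing points."""

    def __init__(self, points_per_view: int = 1):
        self.points_per_view = points_per_view
        self.views = Counter()

    def record(self, ad_type: str) -> None:
        self.views[ad_type] += 1

    def report(self) -> dict:
        # Payload the device might send to the point providing server.
        return {"views": dict(self.views),
                "points": sum(self.views.values()) * self.points_per_view}

log = AdViewLog()
log.record("car")
log.record("car")
log.record("travel")
print(log.report())   # {'views': {'car': 2, 'travel': 1}, 'points': 3}
```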
  • the device 100 may display a search box 370 on the UI 300 of the user.
  • the search box 370 may be displayed when the user wants to search for information desired by the user through a pen, finger, keyboard, or the like.
  • the user may display the search box 370 on the UI 300 of the user through signals such as voice, gesture, and touch.
  • the device 100 may display at least one virtual object 380 and 390 on the UI 300 of the user.
  • the device 100 may display the first virtual object 380, which is a 3D image of the boat, on the UI 300 of the user.
  • the first virtual object 380 may be a live image of a boat of a kind desired by the user or may be a graphic image.
  • the device 100 may display the second virtual object 390, which is a three-dimensional image of the dock, on the UI 300 of the user.
  • the device 100 may provide an environment in which the user may feel intimacy with the VR space by setting the UI 300 of the user as a user's own space on the VR space.
  • the above-described example is merely an example for describing the UI 300 of the user.
  • various virtual objects may be disposed on the UI 300 of the user. For example, virtual objects for various objects such as vehicles, airplanes, pets, physical trainers, personal robots, and attending physicians may be placed on the user's UI 300.
  • FIG. 4 is a diagram for describing a method of providing, by the device 100, sports training through the UI 400 of a user, according to an exemplary embodiment.
  • the device 100 may provide training information 415 to the user through the UI 400 of the user.
  • the training information 415 may include an image, text, a diet control table, and the like regarding the exercise motion. The user may enjoy the exercise regardless of the place through the training information 415 without visiting the gym.
  • the device 100 may arrange the avatar 410 of the user on the UI 400 of the user.
  • the device 100 may detect a user's motion while exchanging data sensed in real time with the exercise device 405 equipped with a motion sensor including an acceleration sensor, a gyroscope sensor, a barometric pressure sensor, and the like.
  • the device 100 may control the movement of the user's avatar 410 disposed on the UI 400 of the user according to the detected user's motion.
  • the device 100 may reflect the effect of the exercise performed by the user on the avatar 410 so that the user may feel a sense of achievement. For example, when the user performs the upper body exercise, the device 100 may change the existing avatar 410 into an avatar 420 having an increased muscle of the upper body.
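The following sketch illustrates one possible mapping from a detected exercise motion to an avatar attribute; the motion names, attributes, and the update rule are assumptions made for illustration, not anything specified in the disclosure.

```python
def update_avatar_after_exercise(avatar: dict, detected_motion: str, repetitions: int) -> dict:
    """Reflect the detected exercise on the avatar so the user feels a sense of
    achievement; the motion-to-attribute mapping is illustrative only."""
    effect_map = {
        "upper_body": "upper_body_muscle",
        "squat": "leg_muscle",
        "running": "stamina",
    }
    attribute = effect_map.get(detected_motion)
    if attribute is not None:
        avatar[attribute] = avatar.get(attribute, 0) + repetitions
    return avatar

avatar = {"name": "Alice"}
avatar = update_avatar_after_exercise(avatar, "upper_body", repetitions=20)
print(avatar)   # {'name': 'Alice', 'upper_body_muscle': 20}
```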
  • the device 100 may provide a place in a VR space in which a group exercise such as soccer or baseball may be performed through connection with at least one other user's device.
  • the device 100 may set a space that can be jointly accessed with at least one other user's device.
  • the method of setting a space that is commonly accessible may correspond to the method described above with reference to FIG. 1.
  • the plurality of users may arrange virtual objects necessary for enjoying a group exercise such as soccer and basketball while communicating with each other in the provided template UI.
  • FIG. 5 is a diagram for describing a method of providing, by the device 100, a medical service through a UI 500 of a user, according to an exemplary embodiment.
  • the device 100 may provide a medical service to a user through the UI 500 of the user.
  • the user may display the user's doctor's virtual object 530 on the user's UI 500 through the toolbar icon 110 described above with reference to FIG. 1.
  • the device 100 may receive information regarding the health state of the user while transmitting and receiving data in real time with the device of the user's doctor or the server of the hospital where the doctor is located.
  • the device 100 may transmit the biometric information of the user measured through the sensor worn by the user to the device of the user's doctor or the server of the hospital.
  • the device of the doctor of the user or the server of the hospital may transmit the medical information 510 analyzing the current state of health of the user to the device 100 based on the received biometric information of the user.
  • the device 100 may display the received medical information 510 on the UI 500 of the user.
  • health care information 520 such as a prescription, a diet, or a method of taking a drug, which is determined based on the user's health state, may be displayed.
  • through a communication session connected with the device of the user's doctor, the user may receive medical treatment and psychological counseling while talking with the doctor in real time.
  • the device 100 may provide the medical service environment in which the user may feel more realistic by displaying the virtual object 530 of the user's doctor on the UI 500 of the user.
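As a hedged illustration of this exchange, the device-side code below summarizes periodically sensed biometric samples, and a stand-in function mimics the analysis a hospital server might return as medical information; the field names and threshold are invented for the example.

```python
import statistics

def summarize_biometrics(samples):
    """Device-side summary of periodically sensed biometric data that could be
    sent to the doctor's device or the hospital server for analysis."""
    return {
        "mean_heart_rate": statistics.mean(s["heart_rate"] for s in samples),
        "mean_body_temp": statistics.mean(s["body_temp"] for s in samples),
        "sample_count": len(samples),
    }

def mock_hospital_analysis(summary: dict) -> dict:
    # Stand-in for the remote analysis that would come back as medical information.
    status = "elevated heart rate" if summary["mean_heart_rate"] > 100 else "normal"
    return {"assessment": status, "advice": "maintain current exercise routine"}

samples = [{"heart_rate": 72, "body_temp": 36.5}, {"heart_rate": 75, "body_temp": 36.6}]
print(mock_hospital_analysis(summarize_biometrics(samples)))
```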
  • FIG. 6 is a diagram for describing a method of providing, by a device 100, a service to a plurality of users through a UI 600 of a user, according to an exemplary embodiment.
  • the device 100 may be connected to devices 20 and 30 of a plurality of different users.
  • the plurality of other users represent users who are preset to share some information with each other in the VR space with the user.
  • the plurality of other users may be searched through contact information previously stored in the device 100 or may be searched by the user directly inputting profile information on the VR space.
  • the user may enjoy entertainment services together with other users as the device 100 displays the virtual object selected by the user and the virtual objects selected by the other users on the UI 600 of the user.
  • the virtual object 630 of the third vehicle selected by the user of the third device 30 may be displayed.
  • the virtual object 610 of the first car, the virtual object 620 of the second car, and the virtual object 630 of the third car may be virtual objects provided by a Massive Multiplayer Online Role Playing Game (MMORPG) service. have.
  • the device 100 may display not only the user's virtual object but also other users' virtual objects on the user's UI 600, and the user can enjoy services such as games using the virtual objects while the device 100 transmits and receives data in real time with another user's device (e.g., 20).
  • FIG. 7 is a diagram for describing a method of providing, by the device 100, history information of a user through the UI 700 of the user in a VR space, according to an exemplary embodiment.
  • the device 100 may reproduce a past situation based on an image, a video, or the like of a user stored in the device 100.
  • the device 100 may reproduce the first history information.
  • when the first history information is a video in which the user cheered at a soccer game with friends in the past, the device 100 may edit the video as a 3D image and display it on the UI 700 of the user.
  • the device 100 may reproduce the second history information.
  • when the second history information is a video capturing a scene spent with the user's grandmother who has since passed away, the device 100 may reproduce the grandmother in the VR space, helping the user recall past memories.
  • the device 100 may also display thumbnail images, such as a thumbnail image 730 for the third history information and a thumbnail image 750 for the fourth history information, for the remaining pieces of history information, and when the user selects one of the thumbnails, the selected history information may be reproduced and displayed on the user interface 700.
  • FIG. 8 is a diagram for describing a method of setting, by the device 100, a UI of a user through other external servers, according to an exemplary embodiment.
  • the device 100 may transmit and receive information related to the UI of the user in real time through the home management server 810.
  • the home management server 810 is a server that manages identification information of a plurality of users and UIs of a plurality of users.
  • the device 100 may load the UI of the pre-stored user by inputting the identification information of the user input from the user to the home management server 810.
  • the UI of the user may be generated by allocating a preset UI template to the user based on the received identification information of the user, as described above with reference to FIG. 1.
  • the device 100 may search the web for an object to be placed on the UI of the user.
  • the device 100 may select a desired object among the searched objects and request a virtual object for the selected object.
  • the home management server 810 may provide an environment in which the device 100 may easily purchase a virtual object by providing information of the device 100 to the seller server 820 providing the requested virtual object.
  • the device 100 may pay the virtual object through direct communication with the seller server 820, or may pay the virtual object through a connection with a payment means providing server (not shown) to which the seller server 820 is subscribed.
  • FIG. 9 is a diagram for describing a method of providing, by the device 100, a user interface (UI) in a VR space 900 according to an embodiment.
  • the device 100 may provide a UI including a search box 910 for searching for information in the VR space 900.
  • the device 100 may display the search box 910 on the UI in the VR space 900 as a user input for requesting the display of the search box 910 is obtained from the user.
  • the device 100 may display the input information in the search window 910.
  • the device 100 may display the name smart device A on the search window 910.
  • the device 100 may search for information about the object from a web server connected to the device 100.
  • the device 100 changes an image on the VR space 900 while searching for information about an object, thereby providing an environment in which the user may acquire new information on the UI while searching for information about the object.
  • the device 100 may include an advertisement in an image displayed on the VR space 900.
  • the advertisement may be selected based on the user's profile information such as the user's age, gender, occupation, or the like, or may be selected based on a search word previously searched by the user.
  • a method of providing an advertisement while the device 100 searches for information about an object will be described below in detail with reference to FIGS. 12 to 14.
  • the device 100 may move the plurality of advertisement images displayed in the VR space 900 in a predetermined direction so that the user feels as if moving to another space. Also, the speed at which the advertisement images move may be set to increase as the end time of the search for the object approaches.
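One simple way to realize this speed ramp, sketched with assumed parameters (a base speed, a maximum speed, and an expected total search time) rather than anything specified in the disclosure:

```python
def advertisement_scroll_speed(elapsed, expected_total, base_speed=1.0, max_speed=5.0):
    """Return a scroll speed that grows as the search approaches its expected end,
    so the advertisement images appear to accelerate toward the result space."""
    progress = min(max(elapsed / expected_total, 0.0), 1.0)
    return base_speed + (max_speed - base_speed) * progress

for t in (0.0, 2.5, 5.0):
    print(round(advertisement_scroll_speed(t, expected_total=5.0), 2))
# 1.0, 3.0, 5.0 -- fastest just before the search completes
```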
  • the device 100 may receive and display information about the selected advertisement image.
  • the device 100 may continue the search requested by the user while the information about the advertisement image is displayed. For example, when a plurality of advertisement images are displayed while moving in a predetermined direction on the VR space 900 of the device 100, information about a product of the advertisement image touched by the user may be displayed on the VR space 900.
  • a method of selecting one of the plurality of advertisement images by the user is not limited to the touch method. According to another example, the user may select one of the plurality of advertisement images through voice input.
  • the device 100 may stop displaying the plurality of advertisement images and display information on goods of the selected advertisement image on the VR space.
  • the device 100 may display information about the searched object on the VR space 900.
  • the device 100 may display the selected advertisement image and the information about the object together.
  • the device 100 may display only information about the object on the VR space 900.
  • the device 100 may provide a UI in which a plurality of advertisement images move in one direction on the VR space 900 while the user's request to retrieve information about an object is being processed, so that the user can efficiently obtain a variety of information during the time required for the search.
  • FIG. 10 is a flowchart illustrating a method of providing a UI in a VR space by a device according to an exemplary embodiment.
  • the device may display a search box on the VR space.
  • the device may provide a VR space.
  • the VR space may be implemented by projecting an image displayed on the display of the device through a lens having a predetermined angle of view.
  • the lens having the preset angle of view may be included in the device, but this is only an example, and the lens having the preset angle of view may exist outside the device.
  • the device may display the search box on the VR space when a user input for requesting the search box is obtained from the user.
  • if a service is being provided, the device may stop the service and display the search box, or may display the search box together with the service.
  • the device may search for information about the object as a search request for the object is received through the displayed search window.
  • the device may receive a search request for an object through the displayed search box.
  • the user may specify the object to be searched for by using a touch input means, a gesture, a voice signal, or the like.
  • the device may display the specific object in the search box for the user to check.
  • the device may receive the information about the specified object A and display the object A in the search box.
  • the device may search for information about the object A as a search request for the object A is received.
  • the device may connect with external servers, such as a web server, to retrieve information about the object.
  • the device may display at least one advertisement on the VR space while searching for information about the object.
  • the device may display an advertisement selected according to the user's profile on the VR space while searching for information about the object. For example, if the user is a male in his thirties, the device may display an advertisement regarding a product preferred by the male in his thirties.
  • the device may move and display the advertisement image so that the user may feel as if moving from the current position to the other position in the VR space.
  • the series of advertisement images may be moved in the first direction based on the position of the user.
  • the device may divide the VR space into a mesh form so that more advertisements may be displayed in each divided area. Accordingly, the user may watch at least one advertisement until the search for the object is completed.
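A small sketch of such a mesh division, assuming a flat 2D viewport and invented helper names (divide_viewport_into_mesh, assign_ads_to_cells); a real implementation would map the cells onto the VR projection.

```python
def divide_viewport_into_mesh(width, height, rows, cols):
    """Split the visible region into a rows x cols mesh so a different
    advertisement can be placed in each cell."""
    cell_w, cell_h = width / cols, height / rows
    return [{"x": c * cell_w, "y": r * cell_h, "width": cell_w, "height": cell_h}
            for r in range(rows) for c in range(cols)]

def assign_ads_to_cells(cells, ads):
    # Cycle through the available advertisements across the mesh cells.
    return [dict(cell, ad=ads[i % len(ads)]) for i, cell in enumerate(cells)]

cells = divide_viewport_into_mesh(1920, 1080, rows=3, cols=4)
print(len(cells), assign_ads_to_cells(cells, ["car", "travel", "phone"])[0])
# 12 {'x': 0.0, 'y': 0.0, 'width': 480.0, 'height': 360.0, 'ad': 'car'}
```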
  • the device may stop displaying the advertisement and additionally provide information on the selected advertisement. This will be described later in more detail with reference to FIGS. 14 and 15.
  • the device may stop displaying the at least one advertisement and display information about the found object on the VR space.
  • the device may stop displaying the at least one advertisement.
  • the device may display at least one of an image, text, and a video included in the information about the searched object on the VR space.
  • the device may display information on the searched object and information on the product of the selected advertisement together in the VR space.
  • FIG. 11 is a diagram for describing a method of displaying, by a device, a search box 1120 on a VR space 1110 by obtaining a user input.
  • the device may detect a gesture of a user.
  • the device may detect a user's gesture of drawing a rectangle with a finger.
  • the device may generate an image projected through a lens having a predetermined angle of view so that the search window 1120 may be displayed at a position where a user gesture is detected in the VR space 1110.
  • the device may determine the size and shape of the search box displayed according to the type of the detected gesture of the user. For example, when a gesture for drawing a rectangle is obtained, the device may display a search box 1120 having a rectangular shape. According to another example, when a circular gesture is obtained, the device may display a circular search box. The device may determine the size of the search box displayed based on the user's gesture.
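For illustration, a sketch of deriving the search box geometry from a detected gesture; the gesture representation (a type plus traced points in display coordinates) is an assumption of this example, not the patent's data model.

```python
def search_box_from_gesture(gesture):
    """Derive the search box geometry from a detected gesture: the gesture's type
    decides the box shape, and the traced extent decides its size and position."""
    shape = "circle" if gesture["type"] == "circle" else "rectangle"
    xs = [x for x, _ in gesture["points"]]
    ys = [y for _, y in gesture["points"]]
    return {"shape": shape,
            "x": min(xs), "y": min(ys),
            "width": max(xs) - min(xs),
            "height": max(ys) - min(ys)}

gesture = {"type": "rectangle",
           "points": [(200, 300), (600, 300), (600, 500), (200, 500)]}
print(search_box_from_gesture(gesture))
# {'shape': 'rectangle', 'x': 200, 'y': 300, 'width': 400, 'height': 200}
```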
  • the user may request information about the object from the device.
  • the device may provide information about the object as text data.
  • a user of the device may provide information about an object by dragging an image previously stored in the device to the search window 1120.
  • a user of a device may provide information about an object as voice data.
  • the device may search for the object as the object that the user wants to search through the search box 1120 is specified.
  • FIG. 12 is a diagram for describing in more detail a UI in a VR space provided by the device 100, according to an exemplary embodiment.
  • the device 100 may provide a first virtual space 1210, which is a space where a search window is displayed, and a second virtual space 1220, which is a space where a search result is displayed.
  • the device 100 may provide a third virtual space 1230 in which an advertisement is displayed while the search for the object is in progress.
  • the first virtual space 1210, the second virtual space 1220, and the third virtual space 1230 may be distinguished according to the type of information displayed in the VR space provided by the device 100.
  • the device 100 may implement the first virtual space 1210, the second virtual space 1220, and the third virtual space 1230 on the VR space over time.
  • the device 100 may display a search box in a VR space as a user input for requesting a search is received from the user.
  • the space where the search box is displayed according to the user's input may be described as the first virtual space 1210.
  • the device 100 may display at least one advertisement in the VR space until the search is completed.
  • the space in which the at least one advertisement is displayed on the VR space may be described as the third virtual space 1230.
  • the device 100 may implement the third virtual space 1230 by changing the UI of the VR space to at least one image, text, and video of the advertisement.
  • the device 100 may display an advertisement image, text, and video constituting an advertisement while moving in a predetermined direction, thereby providing an environment in which the user may feel a movement of the virtual space.
  • the device 100 may display the second virtual space 1220 in which the search result is provided. As the search is completed, the device 100 stops the advertisement that was displayed while moving on the VR space, thereby giving the user the effect of passing through the third virtual space 1230 and arriving in the second virtual space 1220.
  • FIG. 13 is a diagram for describing a method of providing, by a device, an advertisement by dividing a VR space in a mesh form.
  • the device may display a third virtual space for providing an advertisement.
  • as the user moves to the third virtual space providing an advertisement, the device may divide the VR space into a mesh form so that a plurality of advertisement images can be displayed.
  • the plurality of advertisement images displayed in the divided area in the mesh form may move in a predetermined direction.
  • the device may display information about the product of a selected advertisement in the VR space.
  • the device may change images of an advertisement moving on the third virtual space according to a search time.
  • the speeds of the advertisement image 1310 moving at the start of the search, the advertisement image 1320 moving at the middle of the search, and the advertisement image 1330 moving at the last of the search may be different from each other.
  • the speed of the advertisement image 1330 moving in the last stage of the search close to the time when the search ends may be the fastest.
  • FIG. 14 is a diagram for describing a third virtual space 1410 in which the device 100 provides an advertisement, according to an exemplary embodiment.
  • the device 100 may provide a third virtual space 1410 in which a plurality of advertisements are displayed in the VR space. An image constituting each of the plurality of advertisements may be displayed in the third virtual space 1410.
  • an image constituting each of the plurality of advertisements may move in a predetermined direction.
  • the device 100 may detect the gaze direction of the user and move the images constituting each of the plurality of advertisements in a direction opposite to the gaze direction of the user.
  • the series of images constituting each of the plurality of advertisements may move in the second direction, opposite to the first direction, without interruption until the device 100 completes the search requested by the user.
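A minimal sketch of drifting advertisement images opposite to the gaze direction, assuming the gaze is available as a 2D direction vector (the disclosure does not specify the representation):

```python
def advertisement_drift(gaze_direction, speed):
    """Move advertisement images opposite to the user's gaze direction so the
    user feels as if moving forward through the advertisement space."""
    gx, gy = gaze_direction
    norm = (gx ** 2 + gy ** 2) ** 0.5 or 1.0   # avoid division by zero
    return (-gx / norm * speed, -gy / norm * speed)

print(advertisement_drift((1.0, 0.0), speed=2.0))   # (-2.0, -0.0)
```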
  • the user of the device 100 may select any one of advertisement images moving by moving around the user.
  • the user of the device 100 may select one of the plurality of advertisements through a gesture of touching one of the advertisement images.
  • the device 100 may detect a gesture of the user and select an advertisement corresponding to the gesture of the user from among the advertisement images actually displayed on the display of the device 100. Since the user perceives the image displayed on the device 100 as being at a distance through a lens having a preset angle of view, the device 100 can determine which advertisement the user selected by matching the user's gesture with the advertisements actually displayed on the device 100.
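A toy hit test for this matching step, assuming advertisement regions expressed in display coordinates as (left, top, width, height); the region names and layout are invented for the example.

```python
def pick_advertisement(gesture_point, ad_regions):
    """Match the position of the user's touch gesture against the regions where
    each advertisement is actually rendered on the device display."""
    x, y = gesture_point
    for ad_name, (left, top, width, height) in ad_regions.items():
        if left <= x <= left + width and top <= y <= top + height:
            return ad_name
    return None   # gesture did not land on any advertisement

regions = {"ad_A_car": (0, 0, 400, 300), "ad_B_phone": (400, 0, 400, 300)}
print(pick_advertisement((450, 120), regions))   # ad_B_phone
```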
  • the device 100 may display the information 1420 of the vehicle A, which is a product of the advertisement A, on the VR space.
  • the device 100 may stop displaying the other advertisements as the advertisement A is selected, and display the information 1420 of the vehicle A on the VR space.
  • the device 100 may display the information 1420 of the car A until the search is completed, and when the search is completed, the device 100 may implement the second virtual space in which the search result is displayed on the VR space.
  • the device 100 may preset the type of advertisement displayed on the third virtual space 1410 based on the profile of the user.
  • 15 is a flowchart illustrating a method of displaying, by a device, an advertisement and a search result in a UI on a VR space.
  • the device may search for information about the object as a search request for the object is received through the displayed search window.
  • the device may select at least one advertisement from among the plurality of advertisements based on the user's profile and history information about the at least one object previously requested to be searched.
  • the device may provide profile information of a user to an advertisement server that provides an advertisement, and may receive information about products that are preferred by other users having a profile similar to that of the user. Also, according to another exemplary embodiment, the device may be provided with an advertisement about a product determined to be of high interest by providing information of an object previously searched by the user to an advertisement server providing an advertisement.
  • the device may display the selected advertisement while searching for information about the object.
  • the device may display the selected advertisement on the VR space while searching for information about the object is in progress.
  • the device may stop displaying the at least one advertisement and display information about the found object on the VR space.
  • the device may stop displaying the at least one advertisement.
  • the device may display at least one of an image, text, and a video included in the information about the searched object on the VR space.
  • the device may display information on the searched object and information on the product of the selected advertisement together in the VR space.
  • FIG. 16 is a flowchart illustrating a method of providing content in a VR space by a device, according to an exemplary embodiment.
  • the device may receive a content request regarding a sporting event from a user wearing the device.
  • the device may receive a content request from the user that includes information regarding the type of sporting event.
  • the information on the type of sports event may include information on at least one of a place, a date, and a type of team participating in a sporting event.
  • the user may provide information regarding the type of sports event to the device through input signals such as voice and touch.
  • the device may display a screen for inputting information regarding the type of sports event.
  • the user may input information regarding the type of sporting event on the displayed screen.
  • the device may recognize the user's voice to obtain information on the type of the sports event.
  • the device may acquire a photorealistic image and a virtual image of a place where a sporting event is held.
  • the device may request a live image and a virtual image from a server that generates a live image and a virtual image of a place where a sporting event is held.
  • the device may request a live-action image and a virtual image of the entrance of the stadium A where the sporting event is held.
  • the live-action image may be an image capturing the actual scene at the time when the server receives the request from the device.
  • the virtual image may be a graphic image generated to allow a user to feel a sporting event more realistically in a VR space.
  • the virtual image may include images of advertisements, virtual stores, and virtual museums provided by sponsors sponsoring a game performed in the stadium.
  • the virtual image may include a supporter avatar representing the user.
  • the device may display a live action image and a virtual image of a specific location at a place where a sporting event is conducted according to a gesture of the user.
  • the device may acquire a photorealistic image and a virtual image of a specific location at a place where a sporting event is conducted.
  • the place where the sporting event is conducted may be divided into a plurality of locations.
  • a place where a sporting event is held may be divided into a central station, a central road, a square, and a stadium.
  • the device may acquire and display a live image and a virtual image of the central road, which is the next location of the central station, from the server.
  • the user may control the device to display the virtual image and the live image of a desired location by inputting, by voice, the location to which the user wants to move.
  • the device may communicate in real time with at least one server that generates a virtual image and a live image of a plurality of locations included in a place where a sporting event is held, thereby realizing a virtual image of a location corresponding to a user gesture.
  • the request for the virtual image and the live image transmitted from the device to the server may include identification information regarding a location corresponding to the gesture of the user.
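  • A sketch of the request payload the device might send to such a server follows; the field names and the print-based transport stub are assumptions, since the patent only states that the request includes identification information about the location corresponding to the user's gesture.

```python
import json
import time

def build_image_request(event_id, location_id, want_live=True, want_virtual=True):
    """Build the request body asking the server for the live-action and virtual
    images of one location inside the venue. Field names are illustrative only."""
    return {
        "event_id": event_id,
        "location_id": location_id,          # identification info for the gestured location
        "image_types": [t for t, wanted in
                        (("live", want_live), ("virtual", want_virtual)) if wanted],
        "timestamp": time.time(),            # lets the server return the current scene
    }

def send_request(payload, transport=None):
    """Serialize and hand the request to a transport; a print stub stands in for the
    real-time connection to the image-generating server."""
    body = json.dumps(payload)
    if transport is None:
        print("would send:", body)
        return None
    return transport(body)

if __name__ == "__main__":
    send_request(build_image_request("stadium_A1", "location_B_1720"))
```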
  • FIG. 17 is a diagram for describing in more detail a method of providing, by a device, a live-action image and a virtual image of a place where a sporting event is performed.
  • when the device receives a content request for a sporting event from a user, the device may obtain a photorealistic image and a virtual image of A1, which is the place where the sporting event is conducted.
  • A1, which is the place where the sports game is conducted, may be divided into a plurality of locations. Information about the plurality of locations may be displayed through the device as a map 1700 of the stadium.
  • when the device receives a content request for a sporting event from a user, it is assumed that a live-action image and a virtual image regarding location A 1710, which is the initial location in place A1, are acquired.
  • the device may display the live-action image and the virtual image for location A 1710. Meanwhile, the virtual image may include a map of place A1, a supporter avatar, and an advertisement.
  • the device may display the live image and the virtual image in combination according to the rendering information received from the server.
  • the rendering information may include stitching information for combining the live image and the virtual image, location information indicating where on the live image the virtual image is disposed, and the like. An example in which the live image and the virtual image are combined will be described later with reference to FIGS. 18 to 20.
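  • As an illustration of how rendering information with placement coordinates could drive the combination, the sketch below overlays synthetic virtual-image tiles onto a synthetic live-action frame using the Pillow library (assumed to be available); the layout of the rendering information (layer id mapped to an offset) is an assumption, not taken from the patent.

```python
from PIL import Image

def combine_images(live_image, virtual_layers, rendering_info):
    """Paste each virtual layer onto the live-action frame at the position named
    in the rendering info. `rendering_info` maps layer ids to (x, y) offsets."""
    frame = live_image.copy()
    for layer_id, layer in virtual_layers.items():
        position = rendering_info.get(layer_id)
        if position is not None:
            frame.paste(layer, position)
    return frame

if __name__ == "__main__":
    # Synthetic stand-ins: a grey "live" frame and two coloured "virtual" tiles.
    live = Image.new("RGB", (640, 360), (128, 128, 128))
    layers = {
        "map_overlay": Image.new("RGB", (160, 90), (0, 120, 255)),
        "ad_banner": Image.new("RGB", (200, 40), (255, 200, 0)),
    }
    rendering_info = {"map_overlay": (20, 20), "ad_banner": (420, 300)}
    combined = combine_images(live, layers, rendering_info)
    combined.save("combined_frame.png")
```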
  • the device may detect a gesture of a user and determine a location in A1 that the user wants to move.
  • locations within A1 to which the user can move may include location A 1710, which is a central station; location B 1720, which is a central road; location C 1730, which is a square; and location D 1740, which is a stadium.
  • location C 1730, which is a square, may additionally include a host city public hall 1731, a souvenir shop 1732, a soccer game museum 1733, a toto shop 1734, a local partner showroom 1735, a FIFA world game museum 1736, a global partner showroom, and an opponent team square 1738.
  • according to the user's selection, the user may move to position A2 1750, which is the location of the opposing team of the team selected by the user.
  • the device may generate an avatar representing the user or provide information about another user who uses content related to a sporting event.
  • FIG. 18 is a diagram for describing a method of providing, by a device, a combination of a photorealistic image 1820 and a virtual image 1810 and 1830 for a first location in a VR space.
  • the device may acquire a photorealistic image 1820 and a virtual image 1810 and 1830 of the first location.
  • the device may receive the live image 1820 and the virtual images 1810 and 1830 in real time from a server that generates the live image 1820 and the virtual images 1810 and 1830 of the first location.
  • the device may receive an image in which the photorealistic image 1820 and the virtual images 1810 and 1830 are combined, together with the stitching information used for combining the photorealistic image 1820 and the virtual images 1810 and 1830.
  • the virtual images 1810 and 1830 may include an advertisement image 1832 of a company sponsoring a sporting event or a host city of a sporting event.
  • the user may select the first virtual image 1810 and move to another location connected to the first virtual image 1810.
  • the user may move to the avatar selection and community invitation room 1705 described above with reference to FIG. 17 by selecting the first virtual image 1810.
  • the user may move from the moved location to a location where the user can connect with another user to use the content related to the sporting event.
  • the user may select the first virtual image 1810 to select an avatar representing the user.
  • the device may acquire an image of the user.
  • the image of the user may be obtained from a 3D scanning device or the like that scans the user, but this is only an example, and the image of the user may be stored in advance in the device.
  • the device may generate a supporter avatar based on the obtained user image.
  • the supporter avatar may be used to perform a process such as authentication when moving to another location later. For example, in a place offering a game, it is necessary to confirm whether a fee has been paid in order to move to a location other than the current location. In this case, the device may present the supporter avatar representing the user to the server managing the other location, thereby providing information indicating that the user has paid the fee.
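  • The authentication idea can be illustrated with a small sketch in which the avatar record carries the set of locations whose entrance fee has been paid; the data layout and the server-side check are assumptions made only for this example.

```python
from dataclasses import dataclass, field

@dataclass
class SupporterAvatar:
    user_id: str
    appearance: dict                      # e.g. scanned-face parameters
    paid_locations: set = field(default_factory=set)

def pay_fee(avatar: SupporterAvatar, location_id: str) -> None:
    """Record that the user paid the fee required to enter `location_id`."""
    avatar.paid_locations.add(location_id)

def server_allows_move(avatar: SupporterAvatar, location_id: str) -> bool:
    """Server-side check: the avatar itself proves whether the fee was paid."""
    return location_id in avatar.paid_locations

if __name__ == "__main__":
    avatar = SupporterAvatar("user_1", {"height_cm": 175})
    print(server_allows_move(avatar, "stadium_D_1740"))   # False, fee not yet paid
    pay_fee(avatar, "stadium_D_1740")
    print(server_allows_move(avatar, "stadium_D_1740"))   # True
```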
  • the device may more realistically provide an environment required for a user to watch a sporting event in a VR space by displaying an image in which the photorealistic image 1820 and the virtual images 1810 and 1830 for the first location are combined.
  • FIG. 19 is a diagram for describing a method of providing, by a device, a combination of a live image 1900 and a virtual image 1920, 1940 for a second location in a VR space.
  • the device may acquire the photorealistic image 1900 and the virtual images 1920 and 1940 related to the second location.
  • the live-action image 1900 of the second location represents an image captured at the second location.
  • the virtual images 1920 and 1940 related to the second location may include an advertisement image 1920 and a second virtual image 1940 for a service reflecting a user's taste.
  • the user may select the type of the second virtual image 1940 to be viewed together with the live image 1900. For example, a performance image, an image of a tourist attraction of a game hosting city, and the like may be included in the second virtual image 1940.
  • the device may display in advance the types of virtual images that the user may combine with the live image 1900, to allow the user to select one of the displayed types of virtual images.
  • the device may display discount information, point information, etc., which the user can receive by selecting a specific virtual image.
  • the device may display a supporter avatar 1910 representing a user on an image in which the photorealistic image 1900 and the virtual images 1920 and 1940 are combined.
  • the device may display the supporter avatar 1910 to move according to the user's gesture, thereby giving the user an effect as if the user is in the actual second position.
  • the device may display, along with the supporter avatar 1910 of the user, an avatar 1930 of another user who uses content related to a sporting event together with the user.
  • the device may transmit and receive information with another user's device in real time. For example, the device may generate the state information of the user based on the voice of the user, the gesture of the user, and provide the same to the device of another user.
  • the device may receive another user's state information generated based on the other user's voice, the other user's gesture, and the like, and may display the supporter avatar 1910 and the other user's avatar 1930 based on the received information.
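  • A possible shape for that exchange is sketched below: each device packs the locally sensed gesture and speaking state into a small message, and renders both avatars from the latest local and remote states. The message fields are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class UserState:
    user_id: str
    gesture: str          # e.g. "wave", "cheer", "sit"
    speaking: bool        # derived from the microphone level

def encode_state(state: UserState) -> str:
    """Serialize the locally sensed state for transmission to the other device."""
    return json.dumps(asdict(state))

def decode_state(message: str) -> UserState:
    """Rebuild the remote user's state from a received message."""
    return UserState(**json.loads(message))

def render_avatars(local: UserState, remote: UserState) -> None:
    """Stand-in for drawing both supporter avatars from the two state records."""
    for state in (local, remote):
        action = "talking" if state.speaking else "quiet"
        print(f"avatar {state.user_id}: {state.gesture}, {action}")

if __name__ == "__main__":
    mine = UserState("user_1", "cheer", True)
    incoming = encode_state(UserState("user_2", "wave", False))  # pretend this arrived over the network
    render_avatars(mine, decode_state(incoming))
```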
  • the user may select another user who uses the content related to the sports event together with the user, based on the social network service (SNS) provided by the device.
  • FIG. 20 is a diagram for describing a method of providing, by a device, a service to a user at a third location in a VR space, according to an exemplary embodiment.
  • the device 100 may receive an image 2010 in which a live image and a virtual image of a third location are combined according to a gesture of a user.
  • information about the positions to which the user can move according to the user's gesture may be set in advance.
  • the third position is assumed to be the souvenir shop 1732 described above with reference to FIG. 17.
  • an image obtained by combining a photorealistic image of a product (eg, 2012) that the user can purchase in the souvenir shop 1732 and a virtual image representing the background of the souvenir shop may be acquired by the device 100.
  • the user of the device 100 may select a desired product (eg, 2012) while viewing the products (eg, 2012) displayed at the third location.
  • the device 100 may request the user to input authentication information for payment in order to execute a payment means for purchasing the product A 2012.
  • the device 100 may apply the virtual image of the product A 2012 to the supporter avatar before the user purchases the product A 2012, so that the user may experience the product A 2012 in advance.
  • the user may purchase product A 2012 directly or may purchase a virtual image for product A 2012.
  • the device 100 may store the virtual image of the product A 2012 in the information of the supporter avatar, so that when the user later selects other content about the sporting event, the stored virtual image may be loaded and applied to the supporter avatar.
  • FIG. 21 is a diagram for describing a supporter avatar 2110 provided by a device, according to an exemplary embodiment.
  • the device may generate the supporter avatar 2110 based on user information.
  • the user's information may include a profile including the user's gender, age, body information, occupation, and the like, and an image of the user.
  • identification information regarding the user's payment means may be included in the user's information.
  • the device may display the supporter avatar 2110 on an image in which the live image and the virtual image of the moved specific location are combined according to the gesture obtained from the user.
  • the method of displaying the supporter avatar 2110 may correspond to the method described above with reference to FIG. 19.
  • the device may apply the user's state in the VR space to the supporter avatar 2110.
  • the state of the user may be determined according to product information purchased by the user in the VR space, information of a sports team, and the like that the user supports.
  • the device may apply, to the supporter avatar 2110, a virtual image 2111 of a flag of a sports team that the user purchased in the VR space, a virtual image 2113 of a trophy, virtual images 2115 and 2119 of a race suit, and a virtual image 2117 of a team scarf.
  • the virtual image represents a three-dimensional image of an object such as an actual product.
  • the device may display an image of the supporter avatar 2110 according to the user's request even when the user does not use the content related to the sporting event.
  • the image of the supporter avatar 2110 may be shared with other users through an application such as an SNS.
  • FIG. 22 is a diagram for describing in detail a method of providing, by a device, content related to a sporting event according to an embodiment.
  • the device may acquire a gesture for requesting a change of location to the stadium 1740 described above with reference to FIG. 17. Accordingly, the device may display a combination of photorealistic images 2210, 2220, and 2240 and virtual images 2230 of the stadium 1740.
  • the device may acquire a photorealistic image 2210 of the stadium at a viewpoint corresponding to the location of the seat in the stadium purchased by the user in advance, based on the location information of that seat. Also, the device may provide the photorealistic image 2220 captured at a first viewpoint and the photorealistic image 2240 captured at a second viewpoint, corresponding to the user's location information, so that the user may view the game from various angles.
  • the displayed viewpoint may be updated according to changes in the user's location.
  • when the device acquires a gesture of moving to the right from the user, the device may obtain a photorealistic image at a position moved by a predetermined distance in the right direction from the current viewpoint, as in the sketch below.
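  • The viewpoint update can be pictured as follows, assuming the stadium feeds are indexed by camera positions along one axis and that a rightward gesture moves the viewpoint by a fixed step; the step size and camera layout are illustrative, not from the patent.

```python
STEP_M = 2.0  # assumed predetermined distance moved per gesture, in metres

def move_viewpoint(current_x, gesture):
    """Shift the viewer's position left or right by a fixed step."""
    if gesture == "swipe_right":
        return current_x + STEP_M
    if gesture == "swipe_left":
        return current_x - STEP_M
    return current_x

def nearest_camera_feed(viewpoint_x, camera_positions):
    """Pick the photorealistic feed captured closest to the updated viewpoint."""
    return min(camera_positions, key=lambda cam_x: abs(cam_x - viewpoint_x))

if __name__ == "__main__":
    cameras = [0.0, 5.0, 10.0, 15.0]      # camera x-positions along the stand
    x = 4.0                               # seat position purchased by the user
    x = move_viewpoint(x, "swipe_right")  # user gestures to the right
    print(x, "-> camera at", nearest_camera_feed(x, cameras))
```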
  • the device according to an embodiment may provide game relay information regarding the positions of the players and the scoring situation in the ongoing game as the virtual image 2230.
  • the virtual image 2230 may display an advertisement image provided by a sponsor sponsoring a game.
  • the device may display the live image and the virtual image of the other game together with the currently displayed image in a multi-screen form.
  • FIG. 23 is a diagram for describing a method of providing, by a device, history information of content related to a sports game viewed by a user.
  • the device may provide images 2320, 2330, 2340, 2350, 2360, and 2370 related to a sport game that the user has watched in the past to the user in the preset history space 2310.
  • the images 2320, 2330, 2340, 2350, 2360, and 2370 may be provided in at least one form of a still image or a moving image.
  • when a request for any one of the images 2320, 2330, 2340, 2350, 2360, and 2370 related to a sports game that the user has watched is received, the device may display an image corresponding to the received request.
  • the images 2320, 2330, 2340, 2350, 2360, and 2370 related to the sporting event may be images in which the virtual image and the live action image described above with reference to FIGS. 18 to 22 are combined.
  • the device may display the virtual image and the live image of the selected location. Accordingly, the user may move to a position where a game that the user has watched in the past is performed, and may be provided with an environment in which the user may experience the past memory again.
  • FIG. 24 is a diagram for describing a method of providing an artificial intelligence guide of a sporting event in a device, according to an exemplary embodiment.
  • the device may provide at least one guide 2410 or 2420 in the virtual space 2400 generated by combining a virtual image and a live image of a place where a sports game can be viewed.
  • the device may set the profile information and the preference information of the user as input values of the neural network to generate at least one guide 2410 or 2420 for providing the user with information about a sporting event.
  • the profile information of the user may include information about the nationality, race, age, gender, etc. of the user.
  • the preference information may include information about characteristics of the person that the user prefers.
  • the device may set the face, body type, height, voice, etc. of the guide (eg, 2410) based on the user's profile and preference information.
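  • The patent does not give the network itself, so the sketch below only illustrates the idea of feeding encoded profile and preference values through a tiny feed-forward layer to score candidate guide appearances; the encoding, the (untrained random) weights, and the output options are all invented for the example.

```python
import math
import random

random.seed(0)

def encode_user(profile, preferences):
    """Turn a few profile/preference fields into a numeric feature vector.
    The chosen fields and scaling are assumptions for this sketch."""
    return [
        profile.get("age", 30) / 100.0,
        1.0 if profile.get("gender") == "female" else 0.0,
        1.0 if "deep_voice" in preferences else 0.0,
        1.0 if "tall" in preferences else 0.0,
    ]

def tiny_layer(features, weights, biases):
    """One dense layer with a sigmoid, scoring each guide appearance option."""
    scores = []
    for w_row, b in zip(weights, biases):
        z = sum(w * f for w, f in zip(w_row, features)) + b
        scores.append(1.0 / (1.0 + math.exp(-z)))
    return scores

if __name__ == "__main__":
    options = ["sports_star_face", "mascot_face", "actor_face"]
    # Untrained random weights stand in for a trained network.
    weights = [[random.uniform(-1, 1) for _ in range(4)] for _ in options]
    biases = [0.0] * len(options)
    user_features = encode_user({"age": 27, "gender": "female"}, {"tall"})
    scores = tiny_layer(user_features, weights, biases)
    print(max(zip(scores, options)))  # highest-scoring guide appearance
```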
  • the device may receive, from a server managing the copyright or portrait right of a sports star, a singer, an actor, a competition mascot, or an animation character, payment information for the use of the corresponding copyright or portrait right.
  • the device may provide the payment information to the user, and when the user's payment approval is completed, generate at least one guide 2410 or 2420 based on the acquired characteristics of the sports star, singer, actor, competition mascot, or animated character.
  • the guide (eg, 2410) generated may provide a user of the device with information about a place and a sporting event to watch a sporting event.
  • FIG. 25 is a diagram for describing a payment system using the device 100 that provides a virtual reality (VR) space, according to an embodiment.
  • a payment system may include a device 100, a service providing server 2510, and a payment server 2520.
  • this is merely an example of a payment system, and the present invention is not limited to FIG. 25. That is, according to various embodiments of the present disclosure, the payment system may be configured differently from FIG. 25.
  • Each component of the payment system of FIG. 25 is generally connected via a network.
  • the device 100 may receive an image for providing a VR space from the service providing server 2510.
  • according to an embodiment, the image provided to the device 100 from the service providing server 2510 may include an item regarding at least one of goods and services.
  • the goods may include various objects that can be traded, such as smart devices, cars, furniture, clothes, and sports equipment.
  • the service may include a sports relay service, a health care service, a game service, and a community service.
  • the device 100 may receive a user input for selecting any one of at least one item displayed on the output image.
  • the user input may be in the form of a gesture.
  • the device 100 may detect a gesture of touching the exercise device A in a VR space that the user senses through the output image.
  • the user input may be obtained from an input means such as a touch pen or a smart phone, or may be obtained through a user's voice.
  • the device 100 may select one of at least one item displayed on the output image based on the received user input.
  • the device 100 may display information on the amount of money necessary to purchase the selected item.
  • the device 100 may transmit a purchase progress request to the service providing server 2510.
  • the service providing server 2510 may receive identification information related to the payment means selected by the user of the device 100 from the device 100 and provide it to the payment server 2520.
  • the service providing server 2510 may connect the device 100 and the payment server 2520 so that the device 100 may provide identification information related to a payment method directly to the payment server 2520.
  • the payment server 2520 may provide the payment completion information to the device 100 directly or through the service providing server 2510.
  • the payment completion information may include delivery information of the product, information on using the service, payment information, and the like, but this is only an example and the payment completion information is not limited to the above-described example.
  • FIG. 26 is a flowchart illustrating a payment method of a device 100 that provides a VR space, according to an exemplary embodiment.
  • the device 100 may receive a user input of selecting one of items related to at least one of goods and services displayed on the user interface of the VR space provided by the device 100.
  • the item may include identification information representing each of goods and services provided through the user interface of the VR space.
  • an item may include a product name, a service name, a thumbnail image of the service, a catalog of the product, and the like.
  • the item may be displayed in the form of any one of text, an image, and a video.
  • a user wearing the device 100 may select any one of items related to at least one. For example, a user wearing the device 100 may select a specific item by touching an area in which the specific item is displayed on the sensed VR image or speaking the name of the specific item through voice. However, this is only an example, and a method of selecting one of at least one item by a user wearing the device 100 is not limited to the above-described example.
  • the device 100 may transmit identification information related to the payment means and payment information of the item.
  • the device 100 may display, on the user interface of the VR space, cost information to be paid for using the selected item.
  • the cost information about the selected item may be received from one of the service providing server 2510 and the payment server 2520.
  • the device 100 may display a window for selecting any one of a plurality of payment means capable of paying for the selected item.
  • the user of the device 100 may select one of a plurality of payment means through the displayed window.
  • the device 100 may transmit identification information related to the selected payment means and payment information of the item.
  • the identification information related to the payment means may include a password, etc. preset by the user to approve the use of the payment means.
  • the payment information may include a kind of a product or a service corresponding to the item, shipping address information required to receive the product or use the service.
  • the device 100 may transmit identification information related to the payment means and payment information of the item to the payment server 2520 through the service providing server 2510 or directly to the payment server 2520.
  • the device 100 may display payment completion information on the user interface.
  • when the identification information input by the user for the payment means matches the pre-stored identification information of the user, the device 100 may perform payment for the selected item. In addition, when the payment is completed, the device 100 may receive payment completion information from the payment server 2520. In this case, the payment completion information may be transmitted directly from the payment server 2520 to the device 100 or may be transmitted from the payment server 2520 to the device 100 through the service providing server 2510.
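  • The two delivery paths for the payment-completion information can be sketched as below; the server objects are stubs and the credential check is deliberately simplistic, so this only illustrates the message flow, not the disclosed servers.

```python
class PaymentServerStub:
    def __init__(self, stored_credentials):
        self.stored_credentials = stored_credentials  # user_id -> secret

    def pay(self, user_id, credential, payment_info):
        """Approve payment only when the submitted credential matches the stored one."""
        if self.stored_credentials.get(user_id) != credential:
            return {"status": "rejected"}
        return {"status": "completed", "item": payment_info["item"],
                "delivery": payment_info.get("shipping_address")}

class ServiceServerStub:
    def __init__(self, payment_server):
        self.payment_server = payment_server

    def forward_payment(self, user_id, credential, payment_info):
        """Relay the device's payment request and pass the completion info back."""
        return self.payment_server.pay(user_id, credential, payment_info)

if __name__ == "__main__":
    payment_server = PaymentServerStub({"user_1": "1234"})
    service_server = ServiceServerStub(payment_server)
    info = {"item": "flag_3020", "shipping_address": "virtual"}

    # Path 1: via the service providing server.
    print(service_server.forward_payment("user_1", "1234", info))
    # Path 2: directly to the payment server.
    print(payment_server.pay("user_1", "1234", info))
```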
  • the device 100 may acquire a right to use a service provided by the service providing server 2510 or obtain a product.
  • the product may be an actual product or a virtual product provided in the VR space.
  • the virtual product may include either a real image of the product or a graphic image of the product.
  • FIG. 27 is a diagram for describing a method of paying, by the device 100, a user's right to use a service available through a user interface of a VR space.
  • the device 100 may provide various types of services to a user through a VR space.
  • the device 100 may implement at least one place in the VR space in order to provide a service to the user.
  • the device 100 may change the VR space to a stadium where a sporting event is held.
  • the device 100 may change the VR space to an art gallery, a performance hall, or the like in order to provide a user with an artwork viewing service or a musical viewing service.
  • the device 100 may output a combined image of a live image and a graphic image of a place where a service is provided.
  • the user may select one of the at least one service by selecting an item for each of the places where the at least one service displayed on the user interface of the VR space is provided.
  • the user may select one of an item 2710 for an artwork viewing service, an item 2720 for a musical viewing service, and an item 2730 for a sporting event viewing service, which the device 100 displays on the user interface of the VR space.
  • the device 100 may receive information necessary to implement a place where the selected service is provided from the service providing server 2510. Meanwhile, in order for the device 100 to obtain information necessary for implementing a place where a service is provided from the service providing server 2510, the device 100 may have to pay a cost designated by the service providing server 2510.
  • the device 100 may display the cost information to be paid for using the selected service on the user interface.
  • the user may select any one of a plurality of payment means that can be provided through the device 100.
  • the device 100 may display a window on which the user's identification information for the selected payment means can be input on the user interface in the VR space.
  • the device 100 may transmit the input identification information to the payment server 2520 through the service providing server 2510 or directly to the payment server 2520. As it is confirmed that the identification information of the user received from the payment server 2520 matches the identification information of the preset user, payment for the selected service may be completed.
  • the device 100 may set the VR space as a place where the selected service may be provided. For example, when the user pays for a sports relay service, the device 100 may implement a VR space by outputting a combination of live image and graphic image of a stadium where a sporting event is held.
  • FIG. 28 is a diagram for describing a method of paying, by the device 100, a product that can be purchased through a user interface of a VR space, according to an embodiment.
  • the device 100 may display an item regarding at least one product that can be purchased through the VR space.
  • the device 100 may implement the souvenir hall 2800 in the VR space in the stadium where the sports relay service is provided.
  • the souvenir hall 2800 implemented in the VR space of the device 100 may display an item (eg, 2810) regarding at least one product that can be purchased by the user.
  • the item (for example, 2810) relating to the at least one product may be a photorealistic image, a graphic image, or the like of the at least one product, but this is merely an example, and the item regarding the product is not limited thereto.
  • the user of the device 100 may select any one of items (eg, 2810) related to the displayed at least one product. For example, a user of the device 100 may touch and select an item 2810 related to the first product.
  • the device 100 may transmit information about the item to the service providing server 2510.
  • the service providing server 2510 may provide the device 100 with payment information including information about the payment amount of the first product, a purchase option, and the payment methods available to the user.
  • the purchase option is information indicating whether the purchase concerns the virtual product or the actual product.
  • the device 100 may display the received payment information on the user interface of the device 100.
  • the user of the device 100 may determine whether to purchase a product based on the displayed payment information.
  • one of a plurality of payment means may be selected, and identification information about the selected payment means may be input to the device 100.
  • the device 100 may provide input identification information of the user to the payment server 2520 through the service providing server 2510 or directly to the payment server 2520.
  • FIG. 29 is a flowchart illustrating a method for the device 100 to make a payment for a product or a service using an avatar in a VR space, according to an exemplary embodiment.
  • the device 100 may input identification information of the user corresponding to at least one payment means that can be provided by the device, to attribute information of the avatar on the VR space created based on the user's profile.
  • the device 100 may provide an avatar representing a user in a VR space.
  • the avatar may be produced based on a profile of the user's age, gender, occupation, and the like.
  • the avatar may be produced based on the image of the user.
  • the device 100 may provide the profile of the user and an image of the user to the service providing server 2510 to receive an avatar of the user generated by the service providing server 2510.
  • the device 100 may generate an avatar of the user based on the profile of the user and the image of the user through an avatar generating application provided from the service providing server 2510.
  • the device 100 may input identification information of a payment means for paying a cost of a product or a service to attribute information of an avatar for convenience of payment in a VR space. For example, when the payment means is selected as the xx pay, the device 100 may input identification information of the user for the xx pay to the attribute information of the avatar.
  • the device 100 may receive a user input of selecting one of items related to at least one of goods and services displayed on the user interface of the VR space provided by the device 100.
  • step S2920 may correspond to step S2610 described above with reference to FIG. 26.
  • the device 100 may transmit attribute information of the avatar and payment information of the item to a first server that provides at least one of goods and services.
  • the first server may be the service providing server 2510 or the payment server 2520 described above with reference to FIG. 25.
  • the device 100 may access an item related to at least one of goods and services using an avatar created based on the user's profile in the VR space.
  • in order to receive the goods or services corresponding to the item selected by the user in step S2920, a corresponding cost has to be paid.
  • the device 100 may replace the payment process by transmitting attribute information of the avatar and payment information of the item to the first server without going through a separate payment process.
  • the first server, which receives from the device 100 the attribute information of the avatar including the identification information of the payment means together with the payment information, may determine that the user approves the payment, and may accordingly grant the device 100 the right to use the product or service corresponding to the selected item.
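  • A compact illustration of that shortcut follows: the avatar's attribute record already carries the payment-means identification, so a first-server stub can approve the purchase from the attribute information plus the item's payment information alone. The field names and the credential check are assumptions for the example.

```python
def build_avatar_attributes(profile, payment_means_id, payment_credential):
    """Attach the payment-means identification to the avatar's attribute information."""
    return {
        "gender": profile.get("gender"),
        "age": profile.get("age"),
        "occupation": profile.get("occupation"),
        "payment_means": payment_means_id,
        "payment_credential": payment_credential,   # e.g. a preset password or token
    }

def first_server_approve(avatar_attributes, payment_info, known_credentials):
    """First-server stub: treat a matching credential inside the avatar attributes
    as the user's payment approval and grant the usage right for the item."""
    means = avatar_attributes["payment_means"]
    if known_credentials.get(means) != avatar_attributes["payment_credential"]:
        return {"granted": False}
    return {"granted": True, "item": payment_info["item"],
            "kind": payment_info.get("kind", "virtual")}

if __name__ == "__main__":
    attrs = build_avatar_attributes({"gender": "f", "age": 27, "occupation": "designer"},
                                    "xx_pay", "secret-token")
    print(first_server_approve(attrs, {"item": "trophy_3030", "kind": "virtual"},
                               {"xx_pay": "secret-token"}))
```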
  • the device 100 may display settlement completion information on the user interface.
  • step S2940 may correspond to step S2630 described above with reference to FIG. 26.
  • FIG. 30 is a diagram for describing a method of performing, by a device, a payment for a product or a service using an avatar in a VR space, according to an exemplary embodiment.
  • the attribute information of the avatar 3010 generated based on the profile of the user of the device 100 may include gender, age, occupation, and identification information of the payment means 1 of the user.
  • the device 100 may display information about the payment amount and quantity of the flag 3020 and the trophy 3030 selected by the user, and may receive a purchase request from the user.
  • the payment information, including information indicating whether the purchase concerns the actual goods or the virtual goods, may be transmitted to the first server along with the attribute information.
  • the first server may be a service providing server 2510 or a payment server 2520.
  • when the device 100 transmits the payment information and the attribute information to the service providing server 2510, the transmitted payment information and attribute information may be forwarded to the payment server 2520.
  • a payment process for the selected flag 3020 and the trophy 3030 may be performed.
  • the first server may determine that the purchase of the flag 3020 and the trophy 3030 has been approved by the user, and allow the device 100 to use the flag 3020 and the trophy 3030. In the present embodiment, it is assumed that the user of the device 100 requests the purchase of the virtual goods of the flag 3020 and the trophy 3030.
  • the purchased flag 3020 and trophy 3030 may be applied to the avatar displayed in the VR space provided by the device 100.
  • FIG. 31 is a flowchart for describing a method, performed by the service providing server 2510, of processing payment for at least one of goods and services requested by the device 100, according to an exemplary embodiment.
  • the service providing server 2510 may provide an item regarding at least one of goods and services on a user interface of a VR space provided by the device 100.
  • the service providing server 2510 may transmit, to the device 100, at least one item representing information on a service available in the VR space and information on a product that can be purchased.
  • the device 100 may display an item regarding at least one received on the user interface of the VR space.
  • the service providing server 2510 may receive, from the device 100, a user input selecting any one of the at least one provided item.
  • the service providing server 2510 may provide the device 100 with information about a cost to be paid for using a product or a service corresponding to the item selected based on the user input.
  • a predetermined cost should be paid for the goods or services. Accordingly, the service providing server 2510 may transmit information about the cost to be paid to the device 100.
  • the service providing server 2510 may receive identification information related to a payment means and payment information of an item from the device 100.
  • the service providing server 2510 may receive, from the device 100, payment information including identification information of the payment means selected by the user and information about at least one of the quantity and the use period of the product or service that the user wants to use.
  • the service providing server 2510 may transmit identification information and payment information to the payment server 2520 for payment of the selected product or service.
  • the service providing server 2510 may grant the device 100 the right to use the product or the service as the payment for the product or the service is completed.
  • when the payment is made by the payment server 2520, the service providing server 2510 may receive, from the payment server 2520, information indicating that payment for the product or service selected by the user is completed. Accordingly, the service providing server 2510 may authorize the user to use the goods or services.
  • the user's right to use the product or service may thereby be approved.
  • FIGS. 32 and 33 are block diagrams of the device 100 providing a UI in a VR space, according to an embodiment.
  • the device 100 may include a sensing unit 110, a processor 120, an output unit 130, and a communication unit 160.
  • the device 100 may be implemented by more components than the illustrated components, and the device 100 may be implemented by fewer components.
  • the device 100 may further include an A/V input unit 140, a user input unit 150, and a memory 170, in addition to the sensing unit 110, the processor 120, the output unit 130, and the communication unit 160.
  • the device 100 illustrated in FIGS. 32 and 33 may perform operations of the device described above with reference to FIGS. 1 to 31.
  • the sensing unit 110 may detect at least one of a state of the device 100, a state around the device 100, and a state of a user wearing the device 100, and transmit the detected information to the processor 120.
  • the sensing unit 110 may obtain identification information of a user wearing the device 100. In addition, the sensing unit 110 may track the movement of the user wearing the device 100.
  • the sensing unit 110 may acquire biometric information of a user wearing the device 100 at a predetermined cycle. In addition, the sensing unit 110 may detect a state change of the user while displaying the UI in the VR space.
  • the sensing unit 110 may obtain an input of a user wearing the device 100.
  • the sensing unit 110 may detect a user's gesture and may obtain a user's input.
  • the sensing unit 110 may receive a content request for a sports event from the user.
  • the sensing unit 110 may receive a user input for selecting any one of items related to at least one of goods and services displayed on the user interface of the VR space.
  • the sensing unit 110 may receive a user input for selecting any one of a plurality of payment means that can be provided by the device.
  • the sensing unit 110 may include a geomagnetic sensor 111, an acceleration sensor 112, a temperature/humidity sensor 113, an infrared sensor 114, a gyroscope sensor 115, a position sensor (eg, GPS) 116, a barometric pressure sensor 117, a proximity sensor 118, and an RGB sensor (illuminance sensor) 119, but is not limited thereto.
  • the sensing unit 110 may further include a biosignal detection sensor capable of measuring a user's pulse, blood pressure, brain wave, and the like. Since functions of the respective sensors can be intuitively deduced by those skilled in the art from the names, detailed descriptions thereof will be omitted.
  • the processor 120 typically controls the overall operation of the device 100.
  • the processor 120 may execute programs stored in the memory 170 to control the overall operation of the sensing unit 110, the output unit 130, the A/V input unit 140, the user input unit 150, and the communication unit 160.
  • the processor 120 may allocate a preset template UI for the user in the VR space based on the identification information obtained through the sensing unit 110.
  • the processor 120 may obtain, via the user input unit 150, profile information and information about an object that the user wants to place in the preset template UI from the user.
  • the processor 120 may generate an avatar of the user based on the profile information.
  • the processor 120 may generate a virtual object based on the information about the object and arrange the virtual object on the template UI.
  • the processor 120 may generate a UI for the user by arranging the avatar and the virtual object on the template UI.
  • the processor 120 may set the allocated UI such that at least one advertisement is displayed on one surface of the UI assigned to the user.
  • the processor 120 may generate the user's health information based on the biometric information obtained through the sensing unit 110.
  • the processor 120 may generate at least one of medical information and recommended sports information for the user based on the generated health information.
  • the processor 120 may search for information about an object as a search request for the object is received through the displayed search box.
  • the processor 120 may display at least one advertisement on the VR space through the output unit 130 while searching for information about the object.
  • the processor 120 may stop displaying the at least one advertisement.
  • the processor 120 may select at least one advertisement from among a plurality of advertisements based on a profile of the user and history information about at least one object previously requested to be searched.
  • the processor 120 may move an image constituting at least one advertisement in a predetermined direction.
  • the processor 120 may move the image constituting the at least one advertisement in the preset direction by changing, through the output unit 130, the speed and the position at which the image projected through the lens having the preset angle of view is output.
  • the moving speed of the image may increase as the end time of the search for the object approaches, as in the sketch below.
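  • A simple way to realize such a speed ramp, assuming the device can estimate the remaining search time, is sketched below; the linear curve and the constants are illustrative only.

```python
def advertisement_speed(elapsed_s, expected_total_s, base_speed=1.0, max_speed=5.0):
    """Return a movement speed that grows as the search approaches its expected end."""
    if expected_total_s <= 0:
        return max_speed
    progress = min(elapsed_s / expected_total_s, 1.0)   # 0.0 at start, 1.0 at the end
    return base_speed + (max_speed - base_speed) * progress

if __name__ == "__main__":
    for t in (0.0, 1.5, 2.9):
        print(f"t={t:.1f}s -> speed {advertisement_speed(t, 3.0):.2f}")
```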
  • the processor 120 may determine a place where a sporting event proceeds as a content request is received. In addition, the processor 120 may determine a location in a place where a sporting event corresponding to the user's gesture is performed.
  • the processor 120 may generate history information of at least one content watched by a user before providing content related to a sporting event.
  • the processor 120 may select a user who uses content related to a sporting event from among a plurality of other users connected through a user's SNS account.
  • the processor 120 may generate an avatar image representing the user based on the profile information of the user.
  • the processor 120 may select one of items related to at least one of goods and services based on a user input.
  • the processor 120 may input identification information of the user corresponding to at least one payment means that can be provided by the device 100 to avatar attribute information on the VR space created based on the profile of the user.
  • the output unit 130 is for outputting an audio signal, an image signal, or a vibration signal, and may include a display unit 131, a sound output unit 132, and the like.
  • the display unit 131 displays and outputs information processed by the device 100.
  • the display unit 131 may display a UI for the user generated by the processor 120.
  • the display unit 131 may display at least one category of objects that can be arranged in the template UI, and may display a plurality of objects included in a selected category as the user selects any one of the at least one category.
  • the display unit 131 may display payment information necessary for purchasing the object.
  • the display unit 131 may display information on a product of the selected advertisement in the VR space when a selection input for selecting at least one advertisement displayed by the user is received while searching for information about the object.
  • the display unit 131 may display the search result and the information about the product of the selected advertisement together in the VR space.
  • the display unit 131 may display a live action image and a virtual image of a specific location at a place where a sporting event is performed according to a gesture of a user.
  • the display unit 131 may display, for a specific position, a combination of the image photographed at a first viewpoint and at least one image photographed at a viewpoint different from the first viewpoint.
  • the display unit 131 may display a virtual image of the object in the VR space.
  • the display unit 131 may display history information of the selected content on the VR space.
  • the display unit 131 may display a live image and a virtual image of the selected location on the VR space.
  • the display unit 131 may display a virtual image representing another user on the VR space as the acceptance message for the invitation message transmitted to the device of the selected user is received. According to another example, the display unit 131 may combine the avatar image of the user together with the live image and the virtual image of the place where the sporting event is performed, and display the avatar image on the VR space.
  • the display unit 131 may display the payment completion information on the user interface.
  • the display unit 131 may display payment information about a cost required to access a target place selected based on a user's request among a plurality of places.
  • the display unit 131 may display the avatar generated based on the user's profile in the target space when payment of the cost required to access the target space is completed.
  • the display unit 131 may display a virtual image related to the selected item on the user interface of the VR space.
  • when the display unit 131 and a touch pad form a layered structure to constitute a touch screen, the display unit 131 may be used as an input device in addition to an output device.
  • the sound output unit 132 outputs audio data received from the communication unit 160 or stored in the memory 170. In addition, the sound output unit 132 outputs a sound signal related to a function (for example, a call signal reception sound, a message reception sound, and a notification sound) performed in the device 100.
  • the sound output unit 132 may include a speaker, a buzzer, and the like.
  • the A / V input unit 140 is for inputting an audio signal or a video signal, and may include a camera (not shown) and a microphone (not shown).
  • the camera may obtain an image frame such as a still image or a moving image through an image sensor in a video call mode or a shooting mode.
  • the image captured by the image sensor may be processed by the processor 120 or a separate image processor (not shown).
  • the image frame processed by the camera may be stored in the memory 170 or transmitted to the outside through the communication unit 160. Two or more cameras may be provided according to the configuration aspect of the device 100.
  • the microphone receives an external sound signal and processes it into electrical voice data.
  • the device 100 may further include a lens (not shown).
  • the user of the device 100 may view an image output from the display unit 131 through a lens.
  • the user input unit 150 refers to a means by which a user inputs data for controlling the device 100.
  • the user input unit 150 may receive a user input.
  • the user input unit 150 may be linked with the UI module 171 to receive a user input for selecting at least one of items displayed on the sensing area of each of the plurality of sensors.
  • this is only an example, and the type of the user input received by the user input unit 150 is not limited to the above-described example.
  • the communication unit 160 may include one or more components that allow communication between the device 100 and an external device (eg, an HMD).
  • the communicator 160 may include a short range communicator 161, a mobile communicator 162, and a broadcast receiver 163.
  • the mobile communication unit 162 transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
  • the broadcast receiving unit 163 receives a broadcast signal and / or broadcast related information from the outside through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel. According to an implementation example, the device 100 may not include the broadcast receiver 163.
  • the memory 170 may store a program for processing and controlling the processor 120, and may store input / output data.
  • Programs stored in the memory 170 may be classified into a plurality of modules according to their functions. For example, the programs stored in the memory 170 may be classified into the UI module 171 and the touch screen module 172.
  • the UI module 171 may provide a specialized UI, GUI, and the like, which are linked to the device 100 for each application.
  • the touch screen module 172 may detect a touch gesture on the user's touch screen and transmit information about the touch gesture to the processor 120.
  • FIG. 34 is a block diagram of a service providing server 3400, according to an exemplary embodiment.
  • the service providing server 3400 may include a communication unit 3410, a processor 3420, and a memory 3430. However, not all illustrated components are essential components.
  • the service providing server 3400 may be implemented by more components than the illustrated component, and the service providing server 3400 may be implemented by fewer components.
  • the service providing server 3400 may perform the operations described above with reference to FIGS. 25 to 31.
  • the communicator 3410 may provide an item regarding at least one of goods and services on a user interface of a VR space provided by the device 100.
  • the communicator 3410 may transmit, to the device 100, at least one item representing information on a service available in the VR space and a product that can be purchased.
  • the communicator 3410 may receive, from the device 100, a user input selecting any one of the at least one provided item.
  • the communication unit 3410 may provide the device 100 with information about a cost to be paid for using a product or a service corresponding to the item selected based on the user input.
  • the communication unit 3410 may receive identification information related to a payment means and payment information of an item from the device 100. In addition, the communication unit 3410 may transmit the identification information and the payment information to the payment server 2520 that performs the payment for the selected product or service. However, this is only an example, and when the processor 3420 is capable of processing payments, the payment process may be performed directly without a separate payment server 2520.
  • the processor 3420 typically controls the overall operation of the service providing server 3400.
  • the processor 3420 may control processes related to payment for a service or a product provided through the VR space of the service providing server 3400 by executing programs stored in the memory 3430.
  • the processor 3420 may authorize the device 100 to use the product or the service as the payment for the product or the service is completed.
  • the memory 3430 may store a program for processing and controlling the processor 3420, and may store input / output data.
  • the memory 3430 may store information about at least one of a service and a product to be provided through a user interface in the VR space.
  • the memory 3430 may store information about at least one item of a service and a product.
  • the memory 3430 may store identification information and payment information of a user of the device 100.
  • the device may comprise a processor, a memory for storing and executing program data, a permanent storage such as a disk drive, a communication port for communicating with an external device, and a user interface such as a touch panel, keys, and buttons.
  • Methods implemented by software modules or algorithms may be stored on a computer readable recording medium as computer readable codes or program instructions executable on the processor.
  • the computer-readable recording medium may be a magnetic storage medium (eg, read-only memory (ROM), random-access memory (RAM), a floppy disk, a hard disk, etc.) or an optical reading medium (eg, a CD-ROM or a DVD (Digital Versatile Disc)).
  • the computer readable recording medium can be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
  • the medium is readable by the computer, stored in the memory, and can be executed by the processor.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Disclosed is a method of providing a user interface (UI) in a VR space by a device, the method comprising the steps of: receiving, from a user, a request for content relating to a sporting event; acquiring, in response to the received content request, a live-action image and a virtual image of a place where the sporting event is held; and displaying, in response to a gesture of the user, the corresponding live-action and virtual images of a particular location at the place where the sporting event is held.
PCT/KR2017/004365 2016-04-25 2017-04-25 Procédé, dispositif, et support d'enregistrement pour mettre en place une interface d'utilisateur dans un espace de vr WO2017188696A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20160050121 2016-04-25
KR10-2016-0050121 2016-04-25

Publications (1)

Publication Number Publication Date
WO2017188696A1 true WO2017188696A1 (fr) 2017-11-02

Family

ID=60159900

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/004365 WO2017188696A1 (fr) 2016-04-25 2017-04-25 Procédé, dispositif, et support d'enregistrement pour mettre en place une interface d'utilisateur dans un espace de vr

Country Status (2)

Country Link
KR (4) KR101894021B1 (fr)
WO (1) WO2017188696A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108227927A (zh) * 2018-01-09 2018-06-29 北京小米移动软件有限公司 基于vr的产品展示方法、装置及电子设备

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102067493B1 (ko) * 2018-01-02 2020-01-17 주식회사 한글과컴퓨터 HMD device for displaying a VR-based presentation document and operating method thereof
KR20190104753A (ko) * 2018-03-02 2019-09-11 주식회사 브이알이지이노베이션 Method for providing a virtual reality cinema service and apparatus therefor
KR102084970B1 (ko) * 2018-05-02 2020-03-05 (주)포트러시 Virtual reality viewing method and virtual reality viewing system
KR102346681B1 (ko) * 2019-01-10 2022-01-03 제이에스씨(주) Immersive experiential healthcare service system based on VR and AR
KR102389335B1 (ko) * 2019-04-01 2022-04-20 주식회사 케이티 Apparatus and method for displaying images of a plurality of broadcast channels
KR102298101B1 (ko) * 2019-07-31 2021-09-02 박준영 Adult product sales system and method of operating the system
KR102272503B1 (ko) * 2020-09-18 2021-07-02 주식회사 메이크잇 Method and system for conducting financial product transactions through a head-mounted display device
KR102272841B1 (ko) * 2020-11-27 2021-07-06 주식회사 비욘드테크 Relay video provision and billing system for providing a 360° relay video service, and method therefor
TWI802909B (zh) * 2021-06-15 2023-05-21 兆豐國際商業銀行股份有限公司 Financial transaction system and operating method thereof
WO2023043012A1 (fr) * 2021-09-15 2023-03-23 사회복지법인 삼성생명공익재단 Bioregulation method using image content, computer program, and system
KR102627728B1 (ko) * 2021-11-02 2024-01-23 주식회사 엘지유플러스 Method for generating and authenticating personalized metaverse content, and apparatus and system therefor
KR102654350B1 (ko) * 2023-04-28 2024-04-03 주식회사 젭 Method and system for controlling access to a specific area in a metaverse space based on blockchain record data

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140035861A (ko) * 2013-11-06 2014-03-24 엘지전자 주식회사 Apparatus and method for providing a user interface for a head-mounted display
KR20140065180A (ko) * 2012-11-21 2014-05-29 한국전자통신연구원 Apparatus and method for providing experiential content based on real-time broadcast content
KR20150106772A (ko) * 2014-03-12 2015-09-22 삼성전자주식회사 System and method for displaying a virtual image through an HMD device
KR20160002681A (ko) * 2013-01-16 2016-01-08 인썸(인스티튜트 내셔날 드 라 싼테 에 드 라 리셰르셰메디칼르) Soluble fibroblast growth factor receptor 3 (FGR3) polypeptide for use in the prevention or treatment of skeletal growth retardation disorders
US20160026253A1 (en) * 2014-03-11 2016-01-28 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020074349A (ko) * 2001-03-20 2002-09-30 엘지전자주식회사 Service system for creating a homepage and method of operating the system
KR20020076973A (ko) * 2001-03-31 2002-10-11 데이타박스(주) Internet-based virtual cooking analysis system and method using neural networks and fuzzy systems
KR100408989B1 (en) * 2002-10-08 2003-12-11 Robopia Co Ltd Running machine and operating system and method thereof
KR20060134290A (ko) * 2005-06-22 2006-12-28 원용진 Method for invoking a web-based information retrieval means and system for invoking a web-based information retrieval means
KR100953826B1 (ko) * 2007-08-14 2010-04-20 광주과학기술원 Portable terminal having a user health management function and user health management method using the same
KR100952394B1 (ko) * 2007-12-26 2010-04-14 에스케이커뮤니케이션즈 주식회사 Space management method for a virtual reality service
JP2011039860A (ja) * 2009-08-13 2011-02-24 Nomura Research Institute Ltd Conversation system using a virtual space, conversation method, and computer program
KR20110131680A (ko) * 2010-05-31 2011-12-07 영남이공대학 산학협력단 Virtual Internet shopping mall service system using three-dimensional virtual reality technology
KR101505060B1 (ko) * 2010-08-24 2015-03-26 한국전자통신연구원 System and method for providing a virtual-reality-linked service
KR20120075565A (ko) * 2010-12-15 2012-07-09 고스트리트(주) Mobile sports guide system and method using augmented reality
KR20140135276A (ko) * 2013-05-07 2014-11-26 (주)위메이드엔터테인먼트 Apparatus and method for processing a user's gesture input on a game screen
JP2015177403A (ja) * 2014-03-17 2015-10-05 セイコーエプソン株式会社 Head-mounted display device and control method for a head-mounted display device
KR101569465B1 (ко) * 2013-07-05 2015-11-17 서용창 Message transmission method, message box sales method, and computer-readable recording medium storing a program therefor
KR101517436B1 (ko) * 2013-08-30 2015-05-06 주식회사 지스푼 Augmented reality providing method and augmented reality providing system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140065180A (ko) * 2012-11-21 2014-05-29 한국전자통신연구원 Apparatus and method for providing experiential content based on real-time broadcast content
KR20160002681A (ко) * 2013-01-16 2016-01-08 인썸(인스티튜트 내셔날 드 라 싼테 에 드 라 리셰르셰메디칼르) Soluble fibroblast growth factor receptor 3 (FGR3) polypeptide for use in the prevention or treatment of skeletal growth retardation disorders
KR20140035861A (ко) * 2013-11-06 2014-03-24 엘지전자 주식회사 Apparatus and method for providing a user interface for a head-mounted display
US20160026253A1 (en) * 2014-03-11 2016-01-28 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
KR20150106772A (ко) * 2014-03-12 2015-09-22 삼성전자주식회사 System and method for displaying a virtual image through an HMD device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108227927A (zh) * 2018-01-09 2018-06-29 北京小米移动软件有限公司 VR-based product display method and apparatus, and electronic device
CN108227927B (zh) * 2018-01-09 2021-07-23 北京小米移动软件有限公司 VR-based product display method and apparatus, and electronic device

Also Published As

Publication number Publication date
KR20170121719A (ko) 2017-11-02
KR101894022B1 (ko) 2018-08-31
KR20170121721A (ko) 2017-11-02
KR20170121720A (ko) 2017-11-02
KR101981774B1 (ko) 2019-05-27
KR101894021B1 (ko) 2018-08-31
KR20170121718A (ko) 2017-11-02

Similar Documents

Publication Publication Date Title
WO2017188696A1 (fr) Procédé, dispositif, et support d'enregistrement pour mettre en place une interface d'utilisateur dans un espace de vr
WO2017111234A1 (fr) Procèdè pour la commande d'un objet par un dispositif èlectronique et dispositif èlectronique
WO2016048102A1 (fr) Procédé d'affichage d'image effectué par un dispositif comportant un miroir commutable et ledit dispositif
WO2017069324A1 (fr) Terminal mobile et procédé de commande associé
EP3198376A1 (fr) Procédé d'affichage d'image effectué par un dispositif comportant un miroir commutable et ledit dispositif
WO2016195160A1 (fr) Terminal mobile
WO2016108660A1 (fr) Procédé et dispositif pour commander un dispositif domestique
WO2015137743A1 (fr) Procédé de gestion de contenu, et serveur en nuage correspondant
WO2014017759A1 (fr) Procédé et terminal mobile permettant d'afficher des informations, procédé et dispositif d'affichage permettant de fournir des informations, et procédé et terminal mobile permettant de générer un signal de commande
WO2014119884A1 (fr) Procédé et système d'affichage d'objet et procédé et système de fourniture de cet objet
WO2015016622A1 (fr) Procédé et dispositif électronique pour partager une carte d'images
WO2015156461A1 (fr) Terminal mobile et son procédé de commande
WO2016018086A1 (fr) Système et procédé de gestion de métadonnées
WO2016089079A1 (fr) Dispositif et procédé pour générer en sortie une réponse
WO2015160039A1 (fr) Procédé de commande d'un dispositif d'affichage d'images
AU2017346260B2 (en) Electronic device and computer-readable recording medium for displaying images
WO2017018602A1 (fr) Terminal mobile et procédé de commande correspondant
WO2018074768A1 (fr) Procédé d'affichage d'image et dispositif électronique associé
WO2015142136A1 (fr) Procédé pour collecter des informations multimédias et dispositif associé
WO2022191542A1 (fr) Procédé de fourniture de service d'entraînement à domicile et dispositif d'affichage mettant en œuvre ledit procédé de fourniture de service d'entraînement à domicile
WO2016032113A1 (fr) Terminal mobile et son procédé de commande
WO2017164656A2 (fr) Dispositif d'affichage et son procédé de fonctionnement
WO2017018561A1 (fr) Système de commande d'espace d'exposition et procédé de commande d'espace d'exposition
AU2014213221B2 (en) Method and system for displaying object, and method and system for providing the object
WO2019039632A1 (fr) Procédé et dispositif pour connecter des terminaux d'utilisateur en tant que groupe et fournir un service comprenant des contenus associés au groupe

Legal Events

Date Code Title Description
NENP Non-entry into the national phase
  Ref country code: DE
121 Ep: the epo has been informed by wipo that ep was designated in this application
  Ref document number: 17789880
  Country of ref document: EP
  Kind code of ref document: A1
122 Ep: pct application non-entry in european phase
  Ref document number: 17789880
  Country of ref document: EP
  Kind code of ref document: A1