KR20170121718A - Method and device for providing user interface in the virtual reality space and recording medium thereof - Google Patents


Info

Publication number
KR20170121718A
Authority
KR
South Korea
Prior art keywords
user
information
space
template
displaying
Prior art date
Application number
KR1020170053229A
Other languages
Korean (ko)
Other versions
KR101981774B1 (en)
Inventor
장부다
이정
한명숙
Original Assignee
장부다
한명숙
이정
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 장부다, 한명숙, 이정 filed Critical 장부다
Publication of KR20170121718A publication Critical patent/KR20170121718A/en
Application granted granted Critical
Publication of KR101981774B1 publication Critical patent/KR101981774B1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Disclosed is a method in which a device obtains identification information from a user, allocates a predetermined template UI to the user in a VR space based on the obtained identification information, receives from the user profile information and information about an object to be placed on the predetermined template UI, and creates a UI for the user by placing, on the template UI, an avatar of the user created based on the profile information and a virtual object created based on the information about the object, thereby providing a UI in the VR space.

Description

TECHNICAL FIELD [0001] The present invention relates to a method and a device for providing a user interface in a VR space, and a recording medium therefor.

The disclosed embodiments relate to a method for providing a user interface in a VR space, a device for providing a user interface in the VR space, and a recording medium on which a program for performing a method for providing a user interface in the VR space is recorded.

In recent years, various technologies have been developed to provide images that allow users to experience virtual reality (VR). Such virtual reality experience techniques can be realized through a device such as an HMD (Head Mounted Display), which lets a user experience virtual reality by distorting an image output through a lens according to the user's field of view.

On the other hand, with the development of devices capable of providing a virtual reality experience, interest in services that can be offered to users in the virtual reality world is increasing. Since virtual reality is a space in which users can indirectly have experiences that are difficult to obtain in the real world, it is necessary to provide an environment optimized for individual users.

Provided are a method, a device, and a recording medium for providing a user interface in which the various services a user needs in physical reality can be freely realized, by implementing the user's desired environment in a VR space.

A method for a device according to an embodiment to provide a user interface (UI) in a VR space includes: obtaining identification information from a user of the device; Assigning a predetermined template UI to the user in the VR space based on the obtained identification information; Receiving, from the user, profile information and information about an object to be placed on the predetermined template UI; And creating a UI for the user by arranging, on the template UI, an avatar of the user created based on the profile information and a virtual object generated based on the information about the object.

A method according to an embodiment for providing a user interface (UI) in a VR space comprises: displaying at least one category of objects that can be placed in a template UI; And displaying the plurality of objects included in the selected category as the user selects one of the at least one category.

A method of providing a UI (user interface) in a VR space according to an exemplary embodiment includes displaying payment information necessary for purchasing an object as information about the object is received; And receiving an input relating to the payment means corresponding to the payment information from the user.

A method of providing a user interface (UI) in a VR space according to one embodiment includes displaying at least one advertisement on one side of the assigned UI; And displaying a web page associated with the selected advertisement on the assigned UI when any one of the displayed at least one advertisement is selected.

A method of providing a user interface (UI) in a VR space includes: acquiring biometric information of a user wearing the device according to a predetermined period; And generating health information of the user based on the obtained biometric information.

A method for providing a user interface (UI) in a VR space according to an exemplary embodiment includes generating at least one of medical information and recommended sports information for a user based on generated health information; And displaying the generated information on the assigned UI.

A method of providing a UI (user interface) in a VR space according to an exemplary embodiment includes: detecting a change in a user's state while displaying a UI in a VR space; And changing the user's avatar according to the detected change.

A device for providing a user interface (UI) in a VR space according to an exemplary embodiment includes: a sensing unit for acquiring identification information from a user of the device; a processor that assigns a predetermined template UI to the user in the VR space based on the obtained identification information, receives from the user profile information and information about an object to be placed in the predetermined template UI, and creates a UI for the user by placing, on the template UI, an avatar of the user created based on the profile information and a virtual object generated based on the information about the object; And an output unit for displaying the created UI for the user.

In a device that provides a UI in a VR space according to an embodiment, the output unit displays at least one category of objects that can be placed in the template UI and, as the user selects one of the at least one category, displays a plurality of objects included in the selected category.

In a device for providing a UI in a VR space according to an exemplary embodiment, the output unit displays payment information required for purchase of an object as information about the object is received, and the device further includes a user input section for receiving, from the user, an input relating to a payment means corresponding to the payment information.

In a device providing a UI in a VR space according to an exemplary embodiment, the processor sets at least one advertisement to be displayed on one side of the assigned UI, and the output unit displays, when one of the displayed at least one advertisement is selected, a web page associated with the selected advertisement on the assigned UI.

In a device providing a UI in a VR space according to an exemplary embodiment, the sensing unit may acquire biometric information of a user wearing the device according to a predetermined period, and the processor may generate health information of the user based on the acquired biometric information.

In a device for providing a UI in a VR space according to an exemplary embodiment, the processor generates at least one of medical information and recommended sports information for the user based on the generated health information, and the output unit displays the generated information on the assigned UI.

In a device providing a UI in a VR space according to an exemplary embodiment, the sensing unit senses a change in the user's state while the UI is displayed in the VR space, and the processor changes the avatar of the user according to the detected change.

FIG. 1 is a conceptual diagram for explaining a method of providing a UI (User Interface) in a VR space according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of providing a UI in a virtual space according to an embodiment of the present invention.
FIG. 3 is a diagram for explaining a UI of a user provided by a device according to an exemplary embodiment.
FIG. 4 is a diagram for explaining how a device according to an embodiment provides sports training through a user's UI.
FIG. 5 is a diagram illustrating a method of providing a medical service through a UI of a device according to an exemplary embodiment.
FIG. 6 is a diagram illustrating a method of providing a service to a plurality of users through a UI of a user according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating a method of providing a user's history information in a VR space through a UI of a user according to an embodiment of the present invention.
FIG. 8 is a diagram illustrating a method for a device according to an exemplary embodiment of the present invention to set a user's UI through other external servers.
FIGS. 9 and 10 are block diagrams of a device providing a UI in a VR space according to one embodiment.

The terms used in this specification will first be briefly described, and then the present invention will be described in detail.

While the present invention has been described in connection with what is presently considered to be the most practical and preferred embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. Also, in certain cases, a term may be selected arbitrarily by the applicant, in which case its meaning will be described in detail in the corresponding description. Therefore, the terms used in the present invention should be defined based not simply on their names but on their meanings and on the overall content of the present invention.

When an element is said to "include" another element throughout the specification, it is to be understood that it may further include other elements, without departing from the spirit or scope of the present invention. Also, the terms "part," "module," and the like described in the specification mean units for processing at least one function or operation, which may be implemented in hardware, software, or a combination of hardware and software.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.

FIG. 1 is a conceptual diagram for explaining a method of providing a UI (User Interface) in a VR space according to an embodiment of the present invention.

The device 10 can implement a VR environment by outputting an image on a display viewed through a lens, in order to make the user feel as if located in a new space different from the one where the user is currently located.

The device 10 according to one embodiment can provide a UI by which the user can directly create a space, so that the user can own a space in the VR space. For example, the device 10 may provide the user with a predetermined template UI 100. Here, the predetermined template UI 100 may include icons 110, 120, 130, 140, and 150 necessary for the user to create his or her own space or control the VR space. As shown in FIG. 1, the icons 110, 120, 130, 140, and 150 may include a toolbar icon 110, a home icon 120, a favorite icon 130, a setting icon 140, and a preview icon 150.

The toolbar icon 110 may include icons needed to place, remove, and edit objects desired by the user on the template UI 100. The home icon 120 may move a user from another location to the home in the VR space. Here, the home means a space created by the user in the VR environment. There may be various places in the VR space provided by a web server (not shown) or an external device connected to the device 10. When the user is located at a specific place and wants to move to the home, the place unique to the user that the user created in the VR space, the user can do so by touching the home icon 120.

The favorite icon 130 may store location information on the web of the places the user mainly visits, among the various places in the VR space. For example, the user can store in the favorite icon 130 the address information of a first place and a second place frequently visited in the VR space, and by selecting the favorite icon 130 from another place, can move more easily to either stored place.

The setting icon 140 may include profile information of the user, profile information of other users connectable to the user's device 10, and information on communities in the VR space to which the user belongs. In addition, the user can select the setting icon 140 to update the previously stored profile information of the user, the profile information of other users, and the community information in the VR space.

The preview icon 150 is a window showing a preview of search results for other places, objects, and the like that the user has searched in the VR space. The device 10 can display the user's search results through the preview icon 150, thereby displaying the VR space in which the user is located and the search results separately.

The device 10 according to an exemplary embodiment may create a UI for a user by arranging virtual objects selected by the user on the predetermined template UI 100. In addition, the device 10 may create the UI for the user by arranging not only the virtual objects but also the user's avatar on the template UI.

When implementing the VR space, the device 10 may first display the created user's UI and provide an environment in which the user can enjoy various entertainment activities in the home he or she created in the VR space. Also, by arranging on the created UI various virtual objects that the user does not possess in the real world, the user can feel vicarious satisfaction. A concrete method by which the user sets his or her UI by arranging virtual objects on the predetermined template UI 100 will be described later with reference to FIG.

Meanwhile, the device 10 according to one embodiment may be implemented in various forms. For example, the device 10 described herein may be a mobile phone, a smart phone, a laptop computer, a tablet PC, an electronic book terminal, a digital broadcast terminal, a personal digital assistant (PDA), a portable multimedia player, a navigation system, a smart TV, a smart car, a CE (Consumer Electronics) device (e.g., a refrigerator or an air conditioner with a display panel), or a head mounted display (HMD).

FIG. 2 is a flowchart illustrating a method of providing a UI in a virtual space according to an embodiment of the present invention.

In step S210, the device obtains the identification information from the user of the device.

A device according to an exemplary embodiment may display a message requesting identification information to the user before setting the home space, the space unique to the user, in the VR space. As the message requesting identification information is displayed in the VR space, the user can input his or her identification information using at least one of a touch, a gesture, and a voice. Here, the identification information of the user may include the user's ID and password, but this is only an embodiment, and the identification information is not limited to the above-described example.

In step S220, the device allocates a predetermined template UI for the user in the VR space based on the obtained identification information.

The device 10 according to one embodiment may assign a predetermined template UI to the user as the identification information is acquired. The device 10 can exchange information in real time with a home management server that sets and manages a home space for each of a plurality of users. The device 10 may directly allocate a predetermined template UI or may receive a template UI allocated by the home management server based on the identification information of the user.

In step S230, the device 10 receives, from the user, profile information and information about an object to be placed in the predetermined template UI.

Here, the profile information of the user may include information on the user's occupation, age, sex, body information, photographs, and the like. Objects may include items such as furniture, vehicles, electronic devices, exercise equipment, and clothing; plants such as flowers and trees; and animals such as dogs, cats, and birds.

The device 10 according to an exemplary embodiment may detect a voice, a gesture, a touch signal, or the like of the user to receive information on the object the user wants to arrange. For example, if the user says "A car," the device 10 may detect the voice signal "A car" and determine that the object the user intends to place on the template UI is an A car. According to another example, when the user inputs the category of an object through a search window in the VR space, the device 10 may provide images of various objects included in that category. For example, if the user enters "mid-sized car," the device 10 can display images of cars included in the mid-sized car category. The user can select any one of the car images displayed through the device 10 via a touch input or a gesture.
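The category-based search described above is not given in code by the disclosure; purely as an illustration, it could be sketched as a dictionary lookup, where the catalog contents and function name are hypothetical:

```python
# Hypothetical catalog mapping an object category to the virtual objects
# (here just IDs) whose images the device would display.
CATALOG = {
    "mid-sized car": ["sedan_a", "sedan_b", "sedan_c"],
    "boat": ["sailboat", "motorboat"],
}

def search_objects(category):
    """Return the objects in the category the user entered, or an empty
    list when the category is unknown."""
    return CATALOG.get(category.strip().lower(), [])
```

The user's touch or gesture would then select one entry from the returned list.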

In step S240, the device 10 creates a UI for the user by arranging, on the template UI, the user's avatar created based on the profile information and the virtual object generated based on the information about the object.

The device 10 according to one embodiment may generate the user's avatar in the VR space based on the profile information. For example, the device 10 can generate an avatar similar in appearance to the user based on the body information, age, and photographs included in the profile information. According to another example, the device 10 may acquire, in addition to the profile information, information about the external form desired by the user, and generate the avatar based on that information.

Further, the device 10 can place the generated avatar on the template UI. The user can control the position, movement and posture of the avatar on the template UI through touch, gesture, and voice input.

The device 10 according to an embodiment may acquire a virtual object corresponding to the object the user desires to place, as the information regarding that object is obtained. Here, a virtual object represents a three-dimensional image of an object in the virtual space.

On the other hand, the device 10 may be required to make a payment in order to acquire the virtual object. For example, the device 10 may request the virtual object from the server of a seller that sells the virtual object corresponding to the object selected by the user. The seller's server that received the request may ask the device 10 for a payment means capable of paying the amount calculated for the virtual object, and the device 10 may pay the requested amount using a predetermined payment means. As the amount is paid, the device 10 can acquire the virtual object.

However, this is an embodiment only, and the device 10 may establish a session with the seller's server and a predetermined payment server to pay for the acquisition of the virtual object.
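The purchase flow in the paragraphs above (a quote from the seller's server, then settlement through a payment means) might be sketched as follows; the function and parameter names are illustrative and not part of the disclosure:

```python
def purchase_virtual_object(object_id, price_table, pay):
    """Request the price calculated for the virtual object from the
    seller's server (stubbed here by `price_table`), then settle it with
    the user's payment means (stubbed here by the `pay` callback).

    Returns True when the object was acquired, False otherwise."""
    if object_id not in price_table:
        return False          # the seller does not offer this virtual object
    amount = price_table[object_id]
    return bool(pay(amount))  # the device pays via the predetermined means

# Example: a payment means with a 1000-unit limit.
acquired = purchase_virtual_object("boat", {"boat": 500}, lambda amt: amt <= 1000)
```

In a fuller version, `pay` would stand for the session established with the seller's server and the predetermined payment server.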

The device 10 according to an exemplary embodiment may store the template UI in which the avatar and the virtual object are arranged as the UI for the user. Information on the user's UI stored by the device 10 may be transmitted to the home management server. The home management server may store the received information on the user's UI, matched with the user's identification information. Accordingly, when the user later accesses the VR space and inputs his or her identification information, the stored UI of the user can be received from the home management server.
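The store-and-reload behavior described above, in which the home management server keeps each UI matched to the user's identification information, can be sketched with an in-memory stand-in for that server; all names are illustrative:

```python
# In-memory stand-in for the home management server.
HOME_SERVER = {}

def store_user_ui(user_id, ui):
    """Save the template UI (avatar plus placed virtual objects) matched
    to the user's identification information."""
    HOME_SERVER[user_id] = ui

def load_user_ui(user_id):
    """On a later visit, return the stored UI for the identification
    information entered by the user, or None if none was saved."""
    return HOME_SERVER.get(user_id)

store_user_ui("user-1", {"avatar": "a1", "objects": ["boat", "dock"]})
```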

FIG. 3 is a diagram for explaining a UI 300 of a user provided by the device 10 according to an embodiment.

Referring to FIG. 3, the device 10 may load and display the user's UI 300 previously set in the VR space, as the user's identification information is input. Here, the user's UI 300 may be stored in the device 10 or in the home management server.

The UI 300 of the user may include icons 310, 320, 330, 340, and 350 necessary to control the VR space, corresponding respectively to the toolbar icon 110, the home icon 120, the favorite icon 130, the setting icon 140, and the preview icon 150 described above with reference to FIG. 1.

The device 10 according to one embodiment may display an advertisement 360 on the user's UI 300. Here, the advertisement 360 can be selected based on the profile information of the user. The device 10 may also not display the advertisement 360 on the user's UI 300, according to the user's selection. On the other hand, by having the advertisement 360 displayed on the UI 300, the user can acquire points for viewing it. The device 10 may record information about the type and number of displayed advertisements 360 and provide the recorded information to a point-providing server, thereby obtaining points according to the viewing of the advertisement 360.
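As an illustrative sketch only, the record the device keeps about displayed advertisements, and the points a point-providing server might grant for it, could look like this (the per-view point value is an assumption, not from the disclosure):

```python
def record_ad_views(displayed_ads):
    """Tally the type and number of advertisements the device displayed."""
    tally = {}
    for ad_type in displayed_ads:
        tally[ad_type] = tally.get(ad_type, 0) + 1
    return tally

def points_for(tally, points_per_view=10):
    """Points granted by the point-providing server for the recorded
    views; 10 points per view is a made-up rate."""
    return sum(tally.values()) * points_per_view
```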

The device 10 according to one embodiment may display a search window 370 on the user's UI 300. The search window 370 may be displayed when the user desires to search for information through a pen, a finger, a keyboard, or the like. The user can display the search window 370 on the UI 300 through signals such as voice, gesture, and touch.

The device 10 according to one embodiment may display at least one virtual object 380, 390 on the user's UI 300. For example, if the object selected by the user is a boat, the device 10 may display on the user's UI 300 a first virtual object 380, which is a three-dimensional image of the boat. Here, the first virtual object 380 may be a real image of the kind of boat desired by the user, or may be a graphic image. In addition, the device 10 may display a second virtual object 390, which is a three-dimensional image of a dock, on the user's UI 300.

The device 10 according to an exemplary embodiment may provide an environment in which the user can feel intimacy with the VR space by setting the user's UI 300 as the user's own space in the VR space. Meanwhile, the above description is merely an example for explaining the user's UI 300, and various virtual objects other than those described above may be placed on the UI 300. For example, virtual objects for various objects such as vehicles, airplanes, pets, a personal trainer, a personal robot, and a primary care physician can be placed on the user's UI 300.

FIG. 4 is a diagram illustrating a method by which the device 10 according to one embodiment provides sports training through a user's UI 400.

Referring to FIG. 4, the device 10 may provide training information 415 to the user via the user's UI 400. The training information 415 may include an image, text, and a table describing the exercise motions. Through the training information 415, the user can exercise anywhere, without visiting a gym.

Meanwhile, the device 10 according to an exemplary embodiment may place the user's avatar 410 on the user's UI 400. In addition, the device 10 can sense the user's motion while exchanging sensed data in real time with exercise equipment 405 equipped with a motion sensor including an acceleration sensor, a gyroscope sensor, an air pressure sensor, and the like. The device 10 can control the movement of the avatar 410 placed on the user's UI 400 according to the detected motion of the user.

In addition, the device 10 may reflect the effect of the exercise performed by the user on the avatar so that the user may feel a sense of accomplishment. For example, when the user performs upper-body exercise, the device 10 can change the existing avatar 410 into an avatar 420 with increased upper-body muscle.
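A minimal sketch of the avatar update described in the two paragraphs above, assuming the sensed data arrives as a small dictionary (the field names are invented for illustration):

```python
def update_avatar(avatar, sensed):
    """Mirror the sensed user motion onto the avatar and reflect the
    effect of upper-body exercise by growing a muscle attribute."""
    updated = dict(avatar)
    if "pose" in sensed:                        # follow the user's motion
        updated["pose"] = sensed["pose"]
    if sensed.get("exercise") == "upper_body":  # reflect the exercise effect
        updated["upper_body_muscle"] = updated.get("upper_body_muscle", 0) + 1
    return updated

avatar = update_avatar({"pose": "idle"}, {"pose": "lift", "exercise": "upper_body"})
```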

Meanwhile, the device 10 may provide a place in the VR space where a group exercise such as soccer or baseball can be performed, through connection with at least one other user's device. For example, the device 10 may set a space accessible in common with at least one other user's device. Here, the method of setting the commonly accessible space can correspond to the method described above with reference to FIG. For example, a plurality of users can arrange virtual objects necessary for enjoying a group exercise such as soccer or basketball while communicating with each other through a provided template UI.

FIG. 5 is a diagram illustrating a method by which the device 10 according to an exemplary embodiment provides a medical service through a user's UI 500.

Referring to FIG. 5, the device 10 may provide a medical service to the user through the user's UI 500. For example, the user may place a virtual object 530 of the user's primary care physician on the UI 500 via the toolbar icon 110 described above with reference to FIG. 1. In addition, the device 10 can receive information on the user's health status while transmitting and receiving data in real time with the device of the user's primary care physician or the server of the hospital to which the physician belongs. The device 10 can also transmit the user's biometric information, measured through a sensor worn by the user, to the physician's device or the hospital's server.

The physician's device or the hospital's server can analyze the received biometric information and transmit medical information 510 on the user's current health status to the device 10. The device 10 can display the received medical information 510 on the user's UI 500.

Also, health management information 520, such as a prescription, a diet, or a drug administration method determined based on the user's health state, may be displayed on the user's UI 500.
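How the periodically acquired biometric information might be condensed into displayable health information can be sketched as below; the heart-rate metric and the threshold are illustrative assumptions, not part of the disclosure:

```python
def generate_health_info(samples):
    """Summarize biometric samples acquired at a predetermined period
    (heart rate here) into simple health information for the UI."""
    rates = [s["heart_rate"] for s in samples]
    avg = sum(rates) / len(rates)
    return {
        "avg_heart_rate": avg,
        # Made-up rule: flag an elevated average for a physician's review.
        "advice": "consult physician" if avg > 100 else "normal",
    }

info = generate_health_info([{"heart_rate": 60}, {"heart_rate": 80}])
```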

Through a communication session connected with the physician's device, the device 10 according to one embodiment can let the user receive medical and psychological counseling while holding a real-time conversation with the primary care physician. In addition, the device 10 can display the virtual object 530 of the physician on the user's UI 500, thereby providing a medical service environment in which the user can feel more realism.

FIG. 6 is a diagram illustrating a method of providing a service to a plurality of users through a user's UI 600 according to an exemplary embodiment of the present invention.

Referring to FIG. 6, the device 10 may be connected to a plurality of other users' devices 20 and 30. Here, the plurality of other users are users who are preset to share certain parts of their information in the VR space with the user. The plurality of other users may be found through contact information pre-stored in the device 10, or by the user directly inputting their profile information in the VR space.

By displaying the virtual objects selected by the user together with the virtual objects selected by the other users on the user's UI 600, the device 10 allows the user to enjoy an entertainment service with the other users. For example, a virtual object 610 of a first car selected by the user of the device 10, a virtual object 620 of a second car selected by the user of the second device 20, and a virtual object 630 of a third car selected by the user of the third device 30 may be displayed. Here, the virtual objects 610, 620, and 630 may be virtual objects provided in an MMORPG (Massively Multiplayer Online Role-Playing Game) service.

The device 10 according to one embodiment displays the virtual objects of the other users together with the user's own virtual objects on the user's UI 600, so that the user can enjoy services such as games using the virtual objects while the device 10 exchanges data in real time with another user's device (e.g., the device 20).
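The scene composition step just described, in which each connected device contributes the virtual object its user selected and the device merges them into one shared view, can be sketched as follows. The device IDs and object labels are taken from the example above; the data structure itself is a hypothetical simplification.

```python
# Illustrative sketch: merge per-device object selections into a single
# display list for the user's UI (UI 600). The mapping layout is an
# assumption made for this sketch.

def compose_shared_scene(selections: dict) -> list:
    """Return the virtual objects of all connected users in a stable
    order (sorted by device ID) for rendering on the shared UI."""
    return [selections[device_id] for device_id in sorted(selections)]

scene = compose_shared_scene({
    10: "first car (610)",
    20: "second car (620)",
    30: "third car (630)",
})
```

In a real implementation each entry would arrive over the real-time data channel between devices rather than from a local dictionary.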

FIG. 7 is a diagram for explaining a method in which the device 10 according to an embodiment provides history information of the user through the user's UI 700 in the VR space.

Referring to FIG. 7, the device 10 may reproduce past situations based on images, videos, and other data related to the user stored in the device 10.

According to one embodiment, the device 10 may reproduce first history information when the user selects a thumbnail image 710 for the first history information. For example, if the first history information is a video in which the user cheered at a soccer game with friends, the device 10 may render the video as a three-dimensional image and display it on the user's UI 700.

According to another embodiment, the device 10 may reproduce second history information when the user selects a thumbnail image 720 for the second history information. For example, if the second history information is a video taken with the user's late grandmother, the device 10 may reproduce the grandmother in the VR space, helping the user recall memories of her.

As in the above examples, when the user selects any one of the remaining thumbnail images, such as the thumbnail image 730 for third history information, the thumbnail image 750 for fourth history information, or the thumbnail image 760 for further history information, the device 10 may reproduce the selected history information and display it on the user's UI 700.
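The selection mechanism above, where choosing a thumbnail ID triggers reproduction of the matching history record, amounts to a simple lookup. The registry below is a hypothetical sketch; the thumbnail IDs follow the figure, but the stored values and the `reproduce_history` name are assumptions.

```python
# Minimal sketch of thumbnail-driven history playback: a thumbnail ID
# is mapped to the history information it represents. The registry
# contents are illustrative only.

HISTORY = {
    710: "first history information",
    720: "second history information",
    730: "third history information",
}

def reproduce_history(thumbnail_id: int) -> str:
    """Return the history information to render on the UI (UI 700),
    or raise KeyError if the thumbnail is unknown."""
    if thumbnail_id not in HISTORY:
        raise KeyError(f"no history for thumbnail {thumbnail_id}")
    return f"reproducing {HISTORY[thumbnail_id]}"
```

A real device would resolve the ID to stored images or video and render them as a three-dimensional scene rather than returning a string.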

FIG. 8 is a diagram for explaining a method in which the device 10 according to an embodiment sets the user's UI through external servers.

Referring to FIG. 8, the device 10 may transmit and receive information related to the user's UI in real time through a home management server 810. The home management server 810 manages identification information of a plurality of users and the UIs of those users. For example, the device 10 may load a pre-stored UI of the user by sending the identification information input by the user to the home management server 810. If no UI has yet been assigned to the user, a UI for the user may be created by allocating a preset UI template to the user based on the received identification information, as described above.
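The load-or-create behavior described here is a classic get-or-create pattern: return the stored UI for a known user, otherwise allocate the preset template. The sketch below assumes an in-memory store standing in for the home management server 810; all keys and values are illustrative.

```python
# Hypothetical sketch of the home management server's load-or-create
# logic. STORED_UIS stands in for the server-side store; its contents
# and the UI field names are assumptions for this sketch.

STORED_UIS = {"alice": {"template": "custom", "objects": ["sofa"]}}
DEFAULT_TEMPLATE = {"template": "preset", "objects": []}

def load_or_create_ui(user_id: str) -> dict:
    """Return the user's stored UI if one exists on the server;
    otherwise allocate the preset template UI and persist it."""
    if user_id in STORED_UIS:
        return STORED_UIS[user_id]
    ui = dict(DEFAULT_TEMPLATE)   # copy so the preset is not mutated
    STORED_UIS[user_id] = ui      # persist the newly allocated UI
    return ui
```

On a real server the store would be keyed by the identification information the sensing unit acquires, and the allocation would be a database write rather than a dictionary insert.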

The device 10 may search the web for objects to be placed on the user's UI, select a desired object among the retrieved objects, and request a virtual object corresponding to the selected object. The home management server 810 may provide information about the device 10 to a seller server 820 that provides the requested virtual object, thereby providing an environment in which the device 10 can easily purchase the virtual object. The device 10 may pay for the virtual object through direct communication with the seller server 820, or through a connection with a payment service server (not shown) to which the seller server 820 subscribes.
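The purchase flow just described can be sketched as a single request routed to the seller server, which returns a settlement record for the device to display. Everything below, including the catalog, the prices, and the record fields, is invented for illustration; the disclosure does not specify these details.

```python
# Hypothetical purchase flow: the device requests a virtual object and
# the (simulated) seller server 820 returns a settlement record.
# Catalog contents and prices are made up for this sketch.

CATALOG = {"virtual sofa": 20, "virtual car": 100}

def purchase_virtual_object(device_id: int, object_name: str) -> dict:
    """Route a purchase request to the seller server and return the
    settlement record the device would display on the user's UI."""
    if object_name not in CATALOG:
        raise ValueError(f"seller does not offer {object_name!r}")
    return {"device": device_id, "object": object_name,
            "price": CATALOG[object_name], "settled": True}
```

In the embodiment the settlement could instead go through a separate payment service server to which the seller server subscribes.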

FIGS. 9 and 10 are block diagrams of a device 900 that provides a UI in a VR space, according to one embodiment.

Referring to FIG. 9, the device 900 according to one embodiment may include a sensing unit 910, a processor 920, and an output unit 930. However, not all of the illustrated components are essential: the device 900 may be implemented with more components than those shown, or with fewer.

Referring to FIG. 10, the device 900 according to an embodiment of the present invention may further include an A/V input unit 940, a user input unit 950, a communication unit 960, and a memory 970, in addition to the sensing unit 910, the processor 920, and the output unit 930.

Hereinafter, the components will be described in order.

The sensing unit 910 may sense at least one of the state of the device 900, the state of the surroundings of the device 900, and the state of the user wearing the device 900, and may transmit the sensed information to the processor 920.

The sensing unit 910 according to an exemplary embodiment may acquire identification information of the user wearing the device 900. The sensing unit 910 may also track the movement of the user wearing the device 900.

Meanwhile, the sensing unit 910 may acquire biometric information of the user wearing the device 900 at predetermined intervals. The sensing unit 910 may also detect a change in the user's state while the UI is displayed in the VR space.
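Periodic acquisition with state-change detection can be illustrated as follows: readings are sampled at fixed intervals and a change is flagged when consecutive samples differ by more than some threshold. The interval semantics, the threshold value, and the use of a plain list as the sensor source are all assumptions for this sketch.

```python
# Sketch of periodic biometric acquisition with change detection.
# The threshold and the pulse values are illustrative assumptions.

def detect_state_changes(samples: list, threshold: float = 10.0) -> list:
    """Return the indices at which the user's state is considered to
    have changed, i.e. where successive readings differ by more than
    the threshold."""
    changes = []
    for i in range(1, len(samples)):
        if abs(samples[i] - samples[i - 1]) > threshold:
            changes.append(i)
    return changes

# e.g. a pulse sampled once per interval: the jump from 71 to 95
# is flagged as a state change
flagged = detect_state_changes([70, 72, 71, 95, 96])
```

A device could use such a flag to update the user's avatar or trigger the medical service described with reference to FIG. 5.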

The sensing unit 910 may include, but is not limited to, a magnetic sensor 911, an acceleration sensor 912, a temperature/humidity sensor 913, an infrared sensor 914, a gyroscope sensor 915, a position sensor (e.g., GPS) 916, an air pressure sensor 917, a proximity sensor 918, and an RGB sensor 919. According to another example, the sensing unit 910 may further include a bio-signal detection sensor capable of measuring the user's pulse, blood pressure, brain waves, and the like. Since the function of each sensor can be intuitively deduced from its name by those skilled in the art, a detailed description is omitted.

The processor 920 typically controls the overall operation of the device 900. For example, the processor 920 may generally control the sensing unit 910, the output unit 930, the A/V input unit 940, the user input unit 950, and the communication unit 960.

The processor 920 according to an exemplary embodiment may allocate a predetermined template UI to the user in the VR space based on the identification information obtained through the sensing unit 910. In addition, the processor 920 may acquire, through the user input unit 950, profile information from the user and information about objects the user wishes to place in the template UI. The processor 920 may generate an avatar of the user based on the profile information, and may create virtual objects based on the information about the objects and place them on the template UI. By placing the avatar and the virtual objects on the template UI, the processor 920 can create the UI for the user.
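The UI-generation step, which composes the allocated template, an avatar built from the profile information, and virtual objects built from the object information, can be sketched as a pure function. All field names here are illustrative assumptions, not a defined data format of the embodiment.

```python
# Minimal sketch of the processor's UI-generation step: place an
# avatar and virtual objects onto the allocated template UI. The
# dictionary layout is an assumption for this sketch.

def generate_user_ui(template: dict, profile: dict, objects: list) -> dict:
    """Compose the user's UI from the template UI, the avatar derived
    from the profile information, and the requested virtual objects,
    without mutating the template itself."""
    ui = dict(template)                                  # copy the template
    ui["avatar"] = {"name": profile.get("name", "user")} # avatar from profile
    ui["virtual_objects"] = [{"kind": o} for o in objects]
    return ui

ui = generate_user_ui({"template": "preset"}, {"name": "alice"}, ["sofa", "lamp"])
```

Leaving the template untouched mirrors the idea that the preset template UI is shared across users while each generated UI is user-specific.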

The processor 920 according to an exemplary embodiment may set the UI so that at least one advertisement is displayed on one side of the UI allocated to the user.

The processor 920 according to an exemplary embodiment may generate health information of the user based on the biometric information obtained through the sensing unit 910, and may generate at least one of health status information and recommended exercise information for the user based on the generated health information.
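A derivation of health status and an exercise recommendation from biometric readings could look like the sketch below. The specific rules and cutoffs (the 90 bpm and 5,000-step thresholds) are invented for illustration and are not taken from the disclosure.

```python
# Illustrative derivation of health information and a recommended
# exercise from biometric readings; thresholds are assumptions.

def health_report(heart_rate: int, steps_today: int) -> dict:
    """Summarize the user's state and suggest exercise, as could be
    displayed on the UI allocated to the user."""
    status = "resting" if heart_rate < 90 else "active"
    recommend = "take a walk" if steps_today < 5000 else "maintain routine"
    return {"status": status, "recommended_exercise": recommend}
```

The resulting record corresponds to the health status and recommended exercise information the processor would place on the allocated UI.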

The output unit 930 is for outputting an audio signal, a video signal, or a vibration signal. The output unit 930 may include a display unit 931 and an audio output unit 932.

The display unit 931 displays and outputs information processed by the device 900. For example, the display unit 931 may display the UI generated for the user by the processor 920. The display unit 931 may also display at least one category of objects that can be placed in the template UI and, as the user selects one of the categories, display the plurality of objects included in the selected category.

The display unit 931 may also display the payment information necessary for purchasing an object as the information about the object is received.

Meanwhile, when the display unit 931 and a touch pad form a layer structure and are configured as a touch screen, the display unit 931 may be used as an input device in addition to an output device. The display unit 931 may include at least one of a liquid crystal display, a thin-film-transistor liquid crystal display, an organic light-emitting diode display, a flexible display, a three-dimensional (3D) display, and an electrophoretic display. Depending on its implementation, the device 900 may include two or more display units 931, in which case the two or more display units 931 may be arranged to face each other using a hinge.

The audio output unit 932 outputs audio data received from the communication unit 960 or stored in the memory 970. The audio output unit 932 also outputs sound signals related to functions performed by the device 900 (e.g., a call signal reception tone, a message reception tone, or a notification tone). The audio output unit 932 may include a speaker, a buzzer, and the like.

The A/V (Audio/Video) input unit 940 is for inputting an audio signal or a video signal, and may include a camera (not shown) and a microphone (not shown). The camera may obtain image frames, such as still images or moving images, through an image sensor in a video communication mode or a photographing mode. The images captured via the image sensor may be processed by the processor 920 or by a separate image processing unit (not shown).

The image frames processed by the camera may be stored in the memory 970 or transmitted externally through the communication unit 960. Two or more cameras may be provided depending on the configuration of the device 900.

The microphone receives an external acoustic signal and processes it into electrical voice data. For example, the microphone may receive acoustic signals from an external device or a speaker. The microphone may use various noise-cancellation algorithms to remove noise generated while receiving the external acoustic signal.

Meanwhile, the device 900 according to one embodiment may further include a lens (not shown). The user of the device 900 may view the image output from the display unit 931 through the lens.

The user input unit 950 is a means by which the user inputs data for controlling the device 900. For example, the user input unit 950 may include, but is not limited to, a keypad, a dome switch, a touch pad (e.g., a contact capacitive type, a pressure resistive type, an infrared detection type, a surface acoustic wave conduction type, an integral tension measurement type, or a piezo effect type), a jog wheel, or a jog switch.

The user input unit 950 may receive user input. The user input unit 950 may also interact with the UI module 971 to receive a user input selecting at least one item displayed in the sensing area of each of the plurality of sensors. However, this is only an example, and the type of user input received by the user input unit 950 is not limited thereto.

The communication unit 960 may include one or more components that allow communication between the device 900 and an external device (e.g., an HMD). For example, the communication unit 960 may include a short-range communication unit 961, a mobile communication unit 962, and a broadcast receiving unit 963.

The short-range wireless communication unit 961 may include, but is not limited to, a Bluetooth communication unit, a BLE (Bluetooth Low Energy) communication unit, an NFC (Near Field Communication) unit, a WLAN (Wi-Fi) communication unit, a Zigbee communication unit, an IrDA (Infrared Data Association) communication unit, a WFD (Wi-Fi Direct) communication unit, a UWB (ultra-wideband) communication unit, and an Ant+ communication unit.

The mobile communication unit 962 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include various types of data according to voice call signals, video call signals, or text/multimedia message transmission and reception.

The broadcast receiving unit 963 receives broadcast signals and/or broadcast-related information from the outside via a broadcast channel. Broadcast channels may include satellite channels and terrestrial channels. Depending on the embodiment, the device 900 may not include the broadcast receiving unit 963.

The memory 970 may store programs for the processing and control operations of the processor 920, and may store input/output data (e.g., pixel values of images).

The memory 970 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), and an optical disc. The device 900 may also use web storage or a cloud server on the Internet that performs the storage function of the memory 970.

The programs stored in the memory 970 may be classified into a plurality of modules according to their functions, for example, a UI module 971 and a touch screen module 972.

The UI module 971 may provide a specialized UI, GUI, and the like that interoperate with the device 900 for each application. The touch screen module 972 may detect a touch gesture on the user's touch screen and pass information about the touch gesture to the processor 920. The touch screen module 972 according to one embodiment may recognize and analyze a touch code. The touch screen module 972 may be configured as separate hardware including a controller.

Various sensors may be provided in or near the touch screen to sense a touch or near-touch on the touch screen. One example of a sensor for sensing a touch on the touch screen is a tactile sensor, which detects contact by a specific object to a degree equal to or greater than what a person can feel. The tactile sensor can detect various information, such as the roughness of the contact surface, the rigidity of the contacting object, and the temperature of the contact point.

A proximity sensor is another example of a sensor for sensing a touch on the touch screen.

The proximity sensor is a sensor that detects, without mechanical contact, the presence of an object approaching or located near a predetermined detection surface, using the force of an electromagnetic field or infrared rays. Examples of proximity sensors include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. The user's touch gestures may include a tap, touch & hold, double tap, drag, panning, flick, drag and drop, swipe, and the like.
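How a touch screen module might distinguish some of the gestures listed above (tap, double tap, touch & hold) from press/release timestamps can be sketched as follows. The timing thresholds and the event format are assumptions made for this sketch; a real module would also handle movement for drag, flick, and swipe.

```python
# Hypothetical gesture classification from touch timing. Events are
# (kind, timestamp_ms) pairs with kind 'down' or 'up'; the 500 ms hold
# threshold and 300 ms double-tap window are illustrative assumptions.

def classify_gesture(events: list, hold_ms: int = 500,
                     double_ms: int = 300) -> str:
    """Classify a short touch event sequence as 'double tap',
    'touch & hold', or 'tap'."""
    downs = [t for kind, t in events if kind == "down"]
    ups = [t for kind, t in events if kind == "up"]
    # two presses close together: double tap
    if len(downs) == 2 and ups and downs[1] - ups[0] <= double_ms:
        return "double tap"
    # one long press: touch & hold
    if len(downs) == 1 and ups and ups[0] - downs[0] >= hold_ms:
        return "touch & hold"
    return "tap"

g = classify_gesture([("down", 0), ("up", 50), ("down", 200), ("up", 250)])
```

The touch screen module 972 would pass the resulting gesture label, along with position information, to the processor 920.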

An apparatus according to the present invention may include a processor, a memory that stores and executes program data, permanent storage such as a disk drive, a communication port for communicating with external devices, and user interface devices such as a touch panel and keys. Methods implemented as software modules or algorithms may be stored on a computer-readable recording medium as computer-readable code or program instructions executable on the processor. The computer-readable recording medium may include magnetic storage media (e.g., read-only memory (ROM), random-access memory (RAM), floppy disks, and hard disks) and optically readable media such as a DVD (Digital Versatile Disc). The computer-readable recording medium may be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner. The medium is readable by a computer, storable in a memory, and executable on a processor.

All documents cited herein, including publications, patent applications, and patents, are incorporated by reference to the same extent as if each cited document were individually and specifically indicated to be incorporated by reference.

To facilitate understanding of the present invention, reference has been made to the preferred embodiments shown in the drawings, and specific terminology has been used to describe these embodiments. However, the present invention is not limited by this specific terminology, and may include all elements commonly conceivable by those skilled in the art.

The present invention may be described in terms of functional block configurations and various processing steps. Such functional blocks may be implemented in any number of hardware and/or software configurations that perform the specified functions. For example, the present invention may employ integrated circuit configurations, such as memory, processing, logic, and look-up tables, that can carry out a variety of functions under the control of one or more microprocessors or other control devices. Similarly to how the elements of the present invention can be implemented with software programming or software elements, the present invention may be implemented in a programming or scripting language such as C, C++, Java, or assembler, with the various algorithms implemented with any combination of data structures, processes, routines, or other programming elements. Functional aspects may be implemented as algorithms running on one or more processors. Furthermore, the present invention may employ conventional techniques for electronic environment configuration, signal processing, and/or data processing. Terms such as "mechanism", "element", "means", and "configuration" are used broadly and are not limited to mechanical or physical configurations; they may include a series of software routines in conjunction with a processor or the like.

The particular implementations described in the present invention are examples and do not limit the scope of the invention in any way. For brevity, descriptions of conventional electronics, control systems, software, and other functional aspects of such systems may be omitted. The connections or connecting members of the lines between the components shown in the figures illustrate functional connections and/or physical or circuit connections, which in an actual apparatus may be replaced or supplemented by a variety of functional, physical, or circuit connections. Moreover, unless a component is specifically described as "essential" or "important", it may not be a necessary component for the application of the present invention.

In the specification of the present invention (particularly in the claims), the term "the" and similar referring terms may refer to both the singular and the plural. Where a range is described herein, the invention includes each individual value falling within that range (unless stated otherwise), as if each individual value constituting the range were recited in the detailed description. Unless the order of the steps constituting a method according to the invention is explicitly stated, or stated to the contrary, the steps may be performed in any suitable order; the present invention is not necessarily limited to the described order of the steps. The use of any and all examples or exemplary language herein is merely to describe the present invention in detail, and the scope of the invention is not limited by these examples or exemplary language unless limited by the claims. Those skilled in the art will appreciate that various modifications, combinations, and alterations may be made according to design conditions and factors within the scope of the appended claims or their equivalents.

Claims (8)

A method for providing a user interface (UI) in a VR space by a device,
Obtaining identification information from a user of the device;
Assigning a predetermined template UI for the user in the VR space based on the obtained identification information;
Receiving profile information from the user and information about an object the user desires to place in the predetermined template UI; And
Generating a UI for the user by placing, on the template UI, an avatar of the user created based on the profile information and a virtual object created based on the information about the object.
The method according to claim 1,
Displaying at least one category of objects deployable in the template UI;
Further comprising displaying a plurality of objects included in the selected category as the user selects one of the at least one category.
The method according to claim 1,
Displaying payment information necessary for the purchase of the object as the information about the object is received; And
Receiving an input relating to the payment means corresponding to the payment information from the user.
The method according to claim 1,
Displaying at least one advertisement on one side of the assigned UI; And
Further comprising displaying on the assigned UI a web page associated with the selected advertisement if any of the displayed at least one advertisement is selected.
The method according to claim 1,
Acquiring biometric information of a user wearing the device according to a predetermined period; And
And generating health information of the user based on the obtained biometric information.
The method of claim 5,
Generating at least one of health information and recommended sports information for the user based on the generated health information; And
And displaying the generated information on the assigned UI.
The method according to claim 1,
Detecting a change in the state of the user while displaying the UI in the VR space; And
And changing the user's avatar according to the detected change.
A device for providing a user interface (UI) in a VR space,
A sensing unit for acquiring identification information from a user of the device;
A processor configured to allocate a predetermined template UI for the user in the VR space based on the obtained identification information, receive profile information from the user and information about an object that the user desires to place in the predetermined template UI, and generate a UI for the user by placing, on the template UI, an avatar of the user created based on the profile information and a virtual object created based on the information about the object; And
And an output unit for displaying a UI for the generated user.
KR1020170053229A 2016-04-25 2017-04-25 Method and device for providing user interface in the virtual reality space and recordimg medium thereof KR101981774B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20160050121 2016-04-25
KR1020160050121 2016-04-25

Publications (2)

Publication Number Publication Date
KR20170121718A true KR20170121718A (en) 2017-11-02
KR101981774B1 KR101981774B1 (en) 2019-05-27

Family

ID=60159900

Family Applications (4)

Application Number Title Priority Date Filing Date
KR1020170053232A KR101894022B1 (en) 2016-04-25 2017-04-25 Method and device for payment processing in virtual reality space
KR1020170053229A KR101981774B1 (en) 2016-04-25 2017-04-25 Method and device for providing user interface in the virtual reality space and recordimg medium thereof
KR1020170053230A KR20170121719A (en) 2016-04-25 2017-04-25 Method and device for providing user interface in the virtual reality space and recordimg medium thereof
KR1020170053231A KR101894021B1 (en) 2016-04-25 2017-04-25 Method and device for providing content and recordimg medium thereof

Family Applications Before (1)

Application Number Title Priority Date Filing Date
KR1020170053232A KR101894022B1 (en) 2016-04-25 2017-04-25 Method and device for payment processing in virtual reality space

Family Applications After (2)

Application Number Title Priority Date Filing Date
KR1020170053230A KR20170121719A (en) 2016-04-25 2017-04-25 Method and device for providing user interface in the virtual reality space and recordimg medium thereof
KR1020170053231A KR101894021B1 (en) 2016-04-25 2017-04-25 Method and device for providing content and recordimg medium thereof

Country Status (2)

Country Link
KR (4) KR101894022B1 (en)
WO (1) WO2017188696A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190082485A (en) * 2018-01-02 2019-07-10 주식회사 한글과컴퓨터 Hmd device for displaying vr based presentation document and operating method thereof
KR20200086872A (en) * 2019-01-10 2020-07-20 제이에스씨(주) System for health care service for realistic experience based on virtual reality and augmented reality
WO2023043012A1 (en) * 2021-09-15 2023-03-23 사회복지법인 삼성생명공익재단 Biofeedback method using image content, computer program, and system

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108227927B (en) * 2018-01-09 2021-07-23 北京小米移动软件有限公司 VR-based product display method and device and electronic equipment
KR20190104753A (en) * 2018-03-02 2019-09-11 주식회사 브이알이지이노베이션 Method and Apparatus for providing virtual reality cinema service
KR102084970B1 (en) * 2018-05-02 2020-03-05 (주)포트러시 Virtual reality viewing method and virtual reality viewing system
KR102389335B1 (en) * 2019-04-01 2022-04-20 주식회사 케이티 Apparatus and method for displaying videoes of a plurality of broadcast channels
KR102298101B1 (en) * 2019-07-31 2021-09-02 박준영 System of Selling Products for Adult and Driving Method Thereof
KR102272503B1 (en) * 2020-09-18 2021-07-02 주식회사 메이크잇 Method and system for trading financial product through head mounted display device
KR102272841B1 (en) * 2020-11-27 2021-07-06 주식회사 비욘드테크 System for providing broadcast video and billing settlement for providing 360-degree broadcast video content service and method thereof
TWI802909B (en) * 2021-06-15 2023-05-21 兆豐國際商業銀行股份有限公司 Financial transaction system and operation method thereof
KR102627728B1 (en) * 2021-11-02 2024-01-23 주식회사 엘지유플러스 Metaverse personalized content creation and authentication method and apparutus and system therefor
KR102654350B1 (en) * 2023-04-28 2024-04-03 주식회사 젭 Method and system for controlling access to specific area in metaverse space based on blockchain-recorded data

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020074349A (en) * 2001-03-20 2002-09-30 엘지전자주식회사 service system for making of homepage and operation method of this system
KR100408989B1 (en) * 2002-10-08 2003-12-11 Robopia Co Ltd Running machine and operating system and method thereof
KR20060134290A (en) * 2005-06-22 2006-12-28 원용진 Portal-site linking system and portal-site linking method
KR20090017344A (en) * 2007-08-14 2009-02-18 광주과학기술원 Portable device for managing user's health and method of managing user's health using the same
JP2011039860A (en) * 2009-08-13 2011-02-24 Nomura Research Institute Ltd Conversation system, conversation method, and computer program using virtual space
KR20120019007A (en) * 2010-08-24 2012-03-06 한국전자통신연구원 System and method for providing virtual reality linking service
JP2015177403A (en) * 2014-03-17 2015-10-05 セイコーエプソン株式会社 Head-mounted display device and method of controlling head-mounted display device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020076973A (en) * 2001-03-31 2002-10-11 데이타박스(주) A cookig and food analizing system for using a fuzzy-neural networe and a method thereof
KR100952394B1 (en) * 2007-12-26 2010-04-14 에스케이커뮤니케이션즈 주식회사 Space management method for virtual reality service
KR20110131680A (en) * 2010-05-31 2011-12-07 영남이공대학 산학협력단 System and method for providing virtual internet shopping mall service by using 3d vitual reality technology
KR20120075565A (en) * 2010-12-15 2012-07-09 고스트리트(주) Mobile sports guide system and method using augmented reality
KR20140065180A (en) * 2012-11-21 2014-05-29 한국전자통신연구원 Apparatus and method for providing user experiential contents based on real time broadcast contents
JP2016506914A (en) * 2013-01-16 2016-03-07 アンセルムInserm Soluble fibroblast growth factor receptor 3 (FGR3) polypeptide for use in the prevention or treatment of skeletal growth retardation disorders
KR20140135276A (en) * 2013-05-07 2014-11-26 (주)위메이드엔터테인먼트 Method and Apparatus for processing a gesture input on a game screen
KR101569465B1 (en) * 2013-07-05 2015-11-17 서용창 Method for transmitting a message, method for selling a message box and computer readable recording medium storing program for the same
KR101517436B1 (en) * 2013-08-30 2015-05-06 주식회사 지스푼 Method and system for providing augmented reality
KR101878144B1 (en) * 2013-11-06 2018-07-13 엘지전자 주식회사 An Apparatus and Method of Providing User Interface on Head Mounted Display and a Head Mounted Display Thereof
US10203762B2 (en) * 2014-03-11 2019-02-12 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
KR102227659B1 (en) * 2014-03-12 2021-03-15 삼성전자주식회사 System and method for displaying vertual image via head mounted device



Also Published As

Publication number Publication date
KR20170121720A (en) 2017-11-02
KR101894022B1 (en) 2018-08-31
KR20170121719A (en) 2017-11-02
KR20170121721A (en) 2017-11-02
KR101894021B1 (en) 2018-08-31
WO2017188696A1 (en) 2017-11-02
KR101981774B1 (en) 2019-05-27

Similar Documents

Publication Publication Date Title
KR101981774B1 (en) Method and device for providing user interface in the virtual reality space and recording medium thereof
US20210111890A1 (en) Systems and methods for authenticating a user on an augmented, mixed and/or virtual reality platform to deploy experiences
US9836929B2 (en) Mobile devices and methods employing haptics
JP2022046670A (en) System, method, and medium for displaying interactive augmented reality presentation
US20180288391A1 (en) Method for capturing virtual space and electronic device using the same
US20150264432A1 (en) Selecting and presenting media programs and user states based on user states
KR20160128119A (en) Mobile terminal and controlling metohd thereof
JP7036327B2 (en) Rehabilitation system and image processing equipment for higher brain dysfunction
US11755111B2 (en) Spatially aware computing hub and environment
KR20170012979A (en) Electronic device and method for sharing image content
JP6822413B2 (en) Server equipment, information processing methods, and computer programs
US20230185364A1 (en) Spatially Aware Computing Hub and Environment
CN113613028A (en) Live broadcast data processing method, device, terminal, server and storage medium
US10732989B2 (en) Method for managing data, imaging, and information computing in smart devices
CN110651304A (en) Information processing apparatus, information processing method, and program
WO2018122709A1 (en) Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command
CN110446996A (en) A kind of control method, terminal and system
CN112131473A (en) Information recommendation method, device, equipment and storage medium
WO2023026546A1 (en) Information processing device
CN117010965A (en) Interaction method, device, equipment and medium based on information stream advertisement
CN111859199A (en) Locating content in an environment
CN112041787A (en) Electronic device for outputting response to user input using application and method of operating the same
JP7375143B1 (en) Programs and information processing systems
JP7270196B2 (en) Rehabilitation system and image processing device for higher brain dysfunction
JP7029717B1 (en) Rehabilitation system and image processing equipment for higher brain dysfunction

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right