KR20130068593A - Metaverse platform for fusing actual feeling and method for providing service using the same - Google Patents


Info

Publication number
KR20130068593A
KR20130068593A (application KR1020110135881A)
Authority
KR
South Korea
Prior art keywords
behavior
metaverse
user
sensory
information
Prior art date
Application number
KR1020110135881A
Other languages
Korean (ko)
Inventor
박노삼
박상욱
장종현
박광로
Original Assignee
한국전자통신연구원 (Electronics and Telecommunications Research Institute, ETRI)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 (ETRI)
Priority to KR1020110135881A
Publication of KR20130068593A


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10: Services

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • General Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

PURPOSE: A sensory-convergence (reality-fused) metaverse platform device and a service providing method thereof are provided to support both asynchronous and synchronous behavior by recognizing user behavior through sensory sensor technology and wired/wireless network technology. CONSTITUTION: A sensing information receiving unit (310) receives, from clients, sensing information that captures user behavior. A user behavior storage unit (320) stores behavior information. A behavior recognition processing unit (330) classifies the user behavior into asynchronous and synchronous behavior based on the sensing information. If the user behavior is synchronous, a metaverse management unit (370) applies it directly to the movement of an avatar in the metaverse virtual space; if the user behavior is asynchronous, the metaverse management unit applies the accumulated asynchronous behavior to the movement of the avatar. [Reference numerals] (310) Sensing information receiving unit; (320) User behavior storage unit; (330) Behavior recognition processing unit; (340) Social network processing unit; (350) User information storage unit; (AA) Content providing unit; (BB) Metaverse management unit

Description

METAVERSE PLATFORM FOR FUSING ACTUAL FEELING AND METHOD FOR PROVIDING SERVICE USING THE SAME

The present invention relates to a sensory convergence metaverse platform device and a service providing method using the same, and more specifically to a device and method that, in order to provide a realistic entertainment game service, recognize the user's actions, reflect them in the service, and establish social networking among multiple users.

Recently, Second Life, Nintendo, Google, Microsoft, and others have been actively attempting to attract game users' interest by effectively expressing and inferring relationships between game users.

In addition, realistic console games are being released, and such games are expected to move online in the near future so that multiple users can interact in one virtual space.

In addition, schema recognition, which efficiently expresses social relations between game users and provides user convenience through social inference, is growing in importance, and demand for mixed-reality-based, user-centered realistic entertainment services in which the real and the virtual coexist is rising sharply.

Platforms built from unit technologies, including prior-art Massively Multiplayer Online Game (MMOG) server technology, are difficult to apply to realistic converged entertainment services and cannot support convergent services such as user behavior recognition and social networking.

In addition, some technologies use acceleration- and angular-velocity-based sensors to recognize the user's behavior and apply it to realistic game applications, but because they simply reflect the user's movement in the game, they cannot provide services that engage all five senses.

The present invention has been proposed to solve the above-mentioned conventional problems. An object of the present invention is to provide a sensory convergence metaverse platform device, and a service providing method using the same, that reflects the user's behavior data in an avatar in the metaverse engine, which is a virtual space, and creates a social network between users.

In order to achieve the above object, a sensory convergence metaverse platform apparatus according to an embodiment of the present invention includes: a sensing information receiver configured to receive, from a plurality of clients, sensing information that captures a user's behavior; a behavior recognition processor configured to recognize the user's behavior by classifying it into synchronous and asynchronous behavior based on the received sensing information; and a metaverse manager configured to reflect the behavior directly in the avatar's movement in the metaverse virtual space if the recognized user behavior is synchronous, and to reflect the accumulated asynchronous behavior in the avatar's movement if the recognized user behavior is asynchronous.

The apparatus may further include a social network processor configured to generate a social network between users based on a lifelog including user information and log information of a connected client.

The apparatus may further include a social network processor configured to detect a communication partner, based on the social network generated using the lifelog including user information and log information of the connected client, and to provide the partner to the connected client.

The social network processor detects a communication partner by calculating the intimacy between the connected client and other clients in the metaverse virtual space.

In order to achieve the above object, a service providing method using a sensory convergence metaverse platform device according to an embodiment of the present invention includes: receiving, by a sensing information receiver, sensing information that captures user behavior from a connected client; recognizing, by the behavior recognition processor, the user behavior based on the received sensing information by classifying it into synchronous and asynchronous behavior; controlling, by the metaverse manager, to reflect the behavior directly in the movement of the avatar in the metaverse virtual space if the recognized user behavior is synchronous; and controlling, by the metaverse manager, to reflect the accumulated asynchronous behavior in the movement of the avatar in the metaverse virtual space if the recognized user behavior is asynchronous.

The method may further include generating, by the social network processing unit, a social network between users based on a lifelog including user information and log information of a connected client.

The method may further include detecting, by the social network processing unit, a communication partner based on the social network generated using the lifelog including the user information and log information of the connected client, and providing the partner to the connected client.

In the step of detecting and providing the communication partner to the connected client, the social network processor detects the communication partner by calculating the intimacy between the connected client and other clients in the metaverse virtual space.

According to the present invention, the sensory convergence metaverse platform device and the service providing method using the same recognize user behavior by combining sensory sensor technology with wired/wireless network technology, can be applied to various fields based on the collected behavior information, and support not only synchronous but also asynchronous behavior recognition.

In addition, the device and method analyze the behavior of many anonymous users simultaneously through distributed rather than centralized inference to control the behavior of metaverse avatars, which has the effect of enhancing the user's sense of realism and immersion.

In addition, the device and method can be used to analyze the user's disposition, behavior pattern, and lifestyle through data mining techniques, and the analyzed disposition can be used as a parameter for, for example, configuring social networks of similar users and providing individualized services.

In addition, the device and method establish a social network for each user by analyzing users' log information in the metaverse environment and recommend communication partners based on the established social network, thereby supporting interactive social networks and promoting user-centric engagement. By supporting next-generation user communities based on social networks in the virtual space, they can also be applied to the development of context-aware technology.

FIG. 1 is a block diagram illustrating a sensory fusion metaverse platform device according to an embodiment of the present invention.
FIG. 2 is a view explaining an application example of the sensory fusion metaverse platform device according to an embodiment of the present invention.
FIG. 3 is a block diagram illustrating the configuration of a sensory fusion metaverse platform device according to an embodiment of the present invention.
FIGS. 4 and 5 are block diagrams explaining the behavior recognition processor of FIG. 3.
FIG. 6 is a block diagram illustrating the social network processing unit of FIG. 3.
FIG. 7 is a flowchart illustrating a service providing method using a sensory convergence metaverse platform device according to a first embodiment of the present invention.
FIG. 8 is a flowchart illustrating a service providing method using a sensory convergence metaverse platform device according to a second embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS. Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings so that a person skilled in the art can easily carry out the technical idea of the present invention. In the drawings, the same reference numerals designate the same or similar components throughout. In the following description, detailed descriptions of known functions and configurations are omitted where they might obscure the subject matter of the present invention.

First, before describing the embodiments, the features of the present invention are outlined. "Metaverse" is a compound of "meta," meaning processing and abstraction, and "universe," meaning the real world: a form in which virtual worlds such as the web and the Internet are absorbed into the real world. The present invention operates in a metaverse engine environment that projects the behavior of a user in the real world onto an avatar.

In the present invention, the user's actions are classified into asynchronous and synchronous actions.

For asynchronous behavior, sensing data is collected from the user's everyday life, and the collected data is analyzed to infer daily activities. The inference results from asynchronous behavior recognition are used to analyze the user's disposition, behavior pattern, and lifestyle.

For synchronous behavior, in order to overcome the limitations of games that use conventional simple input devices, behavior recognition technology is applied to recognize behavior synchronously, and realism and immersion are improved by reflecting the user's behavior information in the game. Multiple sensors are attached to the user's arms, legs, waist, and so on, and the information each sensor can collect (3-axis acceleration, 3-axis angular velocity, 3-axis geomagnetic field, etc.) is integrated to collect behavior information, infer posture, and recognize actions.
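The integration step above can be sketched as follows. This is an illustrative example only: the patent does not specify how per-sensor readings are combined, so the feature choice (signal-magnitude mean and variance per attachment point) and all names are assumptions.

```python
import math

def magnitude(sample):
    """Euclidean magnitude of one 3-axis reading (x, y, z)."""
    return math.sqrt(sum(v * v for v in sample))

def sensor_features(samples):
    """Mean and variance of the signal magnitude over a window of readings."""
    mags = [magnitude(s) for s in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return {"mean": mean, "var": var}

def fuse(sensor_windows):
    """Combine features from sensors attached to arms, legs, waist, etc."""
    return {pos: sensor_features(window) for pos, window in sensor_windows.items()}

# Two body-worn sensors, each with a short window of 3-axis acceleration samples.
features = fuse({
    "arm": [(0.0, 0.0, 9.8), (0.1, 0.0, 9.7)],
    "waist": [(0.0, 0.0, 9.8), (0.0, 0.0, 9.8)],
})
```

A downstream posture classifier would consume such per-position features; the stationary waist sensor here shows zero variance while the moving arm sensor does not.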

In the present invention, the social network middleware configures the user's social network based on the user's log information stored in the metaverse server, and recommends a communication partner based on the social network.

In order to build the social network, OWL (Web Ontology Language) is used to express user information in the metaverse and to form social relationships between users.

In order to match appropriate communication partners based on the social network, the intimacy between the nodes constituting the network is calculated, and rules for recommending partners are defined.

The social network middleware automatically builds metaverse user information into a social network, receives the metaverse's user log information, and delivers the partner information recommended by the recommendation algorithm back to the metaverse.

Hereinafter, the sensory fusion metaverse platform device according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. FIG. 1 is a block diagram illustrating a sensory fusion metaverse platform device according to an embodiment of the present invention. FIG. 2 is a view explaining an application example of the device. FIG. 3 is a block diagram illustrating the configuration of the device. FIGS. 4 and 5 are block diagrams illustrating the behavior recognition processor of FIG. 3, and FIG. 6 is a block diagram illustrating the social network processor of FIG. 3.

As shown in FIG. 1, the sensory convergence metaverse platform device 300 is connected to a plurality of clients 100 and a plurality of content servers 200 through a network.

The client 100 may be a smartphone, a game machine, a PC, or the like. The client 100 connects to the sensory convergence metaverse platform device 300 through a smartphone game application, a game machine, or PC game software. Through this connection, the client 100 controls an avatar in the metaverse virtual space and uses services such as games. In doing so, the client 100 collects sensing information according to the user's movement and transmits it to the sensory convergence metaverse platform device 300. That is, the client 100 connects to the device 300 through behavior-detecting sensors attached to the user's body and transmits the sensing information captured by those sensors to the device 300. To this end, the client 100 includes a sensor location tracking module. The sensor location tracking module tracks the position and orientation of each sensor by passing the sensing information collected by the periodic sensing module through a data fusion module that uses an extended Kalman filter, and transmits the tracked position and orientation to the sensory convergence metaverse platform device 300 through the communication module 332.
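The predict/correct cycle behind the tracking module can be illustrated with a minimal scalar Kalman filter. The patent's data fusion module uses an extended Kalman filter over 3-D sensor position and orientation; this one-dimensional, constant-state version is only a sketch of the underlying idea, and the noise parameters are invented.

```python
def kalman_step(x, p, z, q=0.01, r=0.1):
    """One predict/update step of a 1-D Kalman filter.
    x: state estimate, p: estimate variance,
    z: new measurement, q: process noise, r: measurement noise."""
    # Predict: constant-state model, so uncertainty grows by the process noise.
    p = p + q
    # Update: blend prediction and measurement using the Kalman gain.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

# Starting from a poor initial estimate, repeated measurements of 1.0
# pull the state toward 1.0 while the estimate variance shrinks.
x, p = 0.0, 1.0
for z in [1.0, 1.0, 1.0]:
    x, p = kalman_step(x, p, z)
```

The extended variant applies the same cycle to a nonlinear motion/orientation model by linearizing around the current estimate.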

The content server 200 stores and manages the content provided to users through the clients 100. The content server 200 provides the content requested by a client 100 through the sensory convergence metaverse platform device 300. Of course, the content server 200 may also provide content directly to the client 100 using metaverse information for controlling the content received from the sensory fusion metaverse platform device 300.

The sensory convergence metaverse platform device 300 analyzes the sensing information from the client 100, recognizes the user's behavior, and reflects it to the avatar in the metaverse virtual space. That is, the sensory convergence metaverse platform device 300 receives the content from the content server 200 and receives the sensing information from the client 100. The sensory convergence metaverse platform device 300 analyzes the received sensing information to recognize the user's behavior and controls the avatar's behavior by reflecting the recognized user's behavior on the avatar on the content.

The sensory convergence metaverse platform device 300 constructs a social network between users using the lifelog (i.e., user information and log information) of each user of a connected client 100, and infers social relationships between users to match communication partners.

The sensory convergence metaverse platform device 300 provides communication functions, such as message transmission, between the clients 100 corresponding to matched communication partners, and enables those clients 100 to use services together on the metaverse. In this case, as shown in FIG. 2, the device 300 may also provide an interface between users connected through the metaverse-platform-based client 100 and other users 400 connected through an online game that is not based on the metaverse platform.

To this end, as shown in FIG. 3, the sensory convergence metaverse platform apparatus 300 includes a sensing information receiver 310, a user behavior storage 320, a behavior recognition processor 330, a social network processor 340, a user information storage unit 350, a metaverse manager 360, and a content provider 370.

The sensing information receiver 310 receives sensing information from the plurality of clients 100. That is, the sensing information receiver 310 receives sensing information corresponding to the user's behavior from the client 100 to reflect the user's behavior in the content.

The user behavior storage 320 may store behavior information that is preset (or collected in advance) for recognizing user behavior.

The behavior recognition processor 330 performs behavior recognition for the user by using the sensing information received through the sensing information receiver 310. At this time, the behavior recognition processor 330 recognizes the user's behavior with reference to the behavior information stored in the user behavior storage 320.

To this end, as shown in FIG. 4, the behavior recognition processor 330 includes a posture estimation module 331 for handling synchronous behavior. The posture estimation module 331 estimates the user's posture based on the sensor positions and orientations received from the sensor location tracking module of the client 100, and transmits the estimated posture to the metaverse manager 360 so that the user's actions are reflected in the content. The behavior recognition processor 330 may further include a communication module 332 for exchanging information with the sensor location tracking module of the client 100 and a support module 333 for supporting multiple users.

To handle the user's asynchronous behavior, as shown in FIG. 5, the behavior recognition processor 330 includes:
  • a filter module 334 that removes noise from the sensing information received through the sensing information receiver 310;
  • a leveling feature extraction module 335 that extracts feature values from the noise-free sensing information and equalizes their weights by adjusting the ranges of the different feature values;
  • a pattern classification module 336 that classifies the patterns of the normalized sensing information using a support vector machine (SVM) algorithm based on statistical learning theory;
  • a wrapper module 337 that converts the normalized sensing information into the input format of the SVM module 339;
  • a high-level behavior recognition engine module;
  • a model storage module 338 that stores and retrieves the model files output by the SVM algorithm; and
  • an SVM module 339 that normalizes the initial sensing information through the SVM algorithm and builds a reference model from the normalized data to predict behavior.
In this case, the SVM module 339 includes a feature extraction module that minimizes system resource use by selecting only the features suitable for behavior recognition from among the many available features.
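The filter, leveling, and pattern-classification stages of FIG. 5 can be sketched end to end. Note the hedges: the patent uses an SVM for classification, but to keep this example self-contained a nearest-centroid classifier stands in for it, and all signals, ranges, and centroids are invented for illustration.

```python
def moving_average(signal, k=3):
    """Filter module stand-in: smooth the raw signal with a window of size k."""
    return [sum(signal[max(0, i - k + 1): i + 1]) /
            len(signal[max(0, i - k + 1): i + 1])
            for i in range(len(signal))]

def normalize(features, lo, hi):
    """Leveling module stand-in: scale each feature into [0, 1] so features
    with different ranges carry equal weight."""
    return [(f - l) / (h - l) for f, l, h in zip(features, lo, hi)]

def classify(features, centroids):
    """Pattern-classification stand-in: nearest centroid by squared distance
    (the patent's SVM module plays this role)."""
    def dist(c):
        return sum((f - v) ** 2 for f, v in zip(features, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

smoothed = moving_average([1.0, 5.0, 1.2, 5.1, 1.1])
feats = normalize([2.5, 40.0], lo=[0.0, 0.0], hi=[5.0, 100.0])
label = classify(feats, {"walking": [0.6, 0.5], "sitting": [0.1, 0.1]})
```

In the real pipeline, the wrapper module would convert `feats` into the SVM's input format, and the model storage module would supply a previously trained model file instead of hard-coded centroids.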

The social network processor 340 configures a social network between users using the lifelog (i.e., user information and log information) of each user of a connected client 100. That is, the social network processing unit 340 applies OWL-Horst inference technology to infer relationships not specified by the user and thereby configures the social network.

The social network processor 340 infers social relationships between users through social network analysis methods such as data mining in order to match communication partners, and recommends the matched partner to the user through the corresponding client 100. That is, when the social network processor 340 receives a matchmaking request from a client 100, it matches an appropriate partner with whom the user can communicate in the metaverse, based on the user's configured social network. The social network processor 340 calculates the intimacy between users on the metaverse based on the configured social network and recommends a communication partner from among the other users in the metaverse. To this end, as shown in FIG. 6, the social network processor 340 includes a match processing module 342 that receives matchmaking requests from clients 100 and returns results, an SNS storage module 344 that analyzes the user information and lifelog stored in the user information storage 350 and converts them into the OWL triple format usable by the inference engine, and an inference module 346 that infers social relationships by applying OWL2RL inference and expresses them in network form.
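The triple-based inference step can be illustrated with a toy rule. Lifelog facts are stored as (subject, predicate, object) triples, and one friend-of-friend rule derives a relationship the user never stated. Real OWL-Horst / OWL2RL reasoning applies a far richer rule set; the predicate names and users here are invented.

```python
# Explicit facts, as triples an SNS storage module might produce from lifelogs.
triples = {
    ("alice", "friendOf", "bob"),
    ("bob", "friendOf", "carol"),
}

def infer_friend_of_friend(facts):
    """Derive (a, knowsIndirectly, c) whenever a-friendOf-b and b-friendOf-c."""
    inferred = set(facts)
    for a, p1, b in facts:
        for b2, p2, c in facts:
            if p1 == p2 == "friendOf" and b == b2 and a != c:
                inferred.add((a, "knowsIndirectly", c))
    return inferred

graph = infer_friend_of_friend(triples)
```

A production reasoner would iterate rules like this to a fixed point over the whole triple store rather than making a single pass.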

The user information storage unit 350 stores the lifelog (i.e., user information and log information) on which social network configuration and social relationship inference are based. The user information storage 350 also stores the behavior recognition information, social network, and communication partner information received from the metaverse manager 360.

The metaverse manager 360 manages a metaverse, which is a virtual space in which an avatar on which the real world is projected acts. That is, the metaverse manager 360 applies the behavior recognition information of the behavior recognition processor 330 and the social network and communication counterpart information of the social network processor 340 to the metaverse. The metaverse manager 360 transmits behavior recognition information, social network, and communication partner information to the user information storage 350.

The content provider 370 is connected to the plurality of clients 100 through a network and delivers to them the metaverse-based content to which the metaverse manager 360 has applied the behavior recognition information, social network, and communication partner information. The content provider 370 serves as a data channel for stably transmitting and receiving large volumes of game data and content to large numbers of users.

Hereinafter, a service providing method using a sensory convergence metaverse platform device according to a first embodiment of the present invention will be described in detail with reference to the accompanying drawings. FIG. 7 is a flowchart illustrating this method, and illustrates how a user's behavior is applied to an avatar in the metaverse virtual space using the sensory convergence metaverse platform device.

First, a user connects to the sensory fusion metaverse platform apparatus 300 using a client 100 based on the sensory fusion metaverse platform (S110). The client 100 connects to the device 300 through behavior recognition sensors attached to the user's body, collects the sensing information detected by those sensors, and transmits it to the device 300. The sensor position tracking module of the client 100 tracks the position and orientation of each sensor by passing the sensing information collected by the periodic sensing module through the data fusion module, which uses an extended Kalman filter, and transmits the tracked position and orientation to the device 300 through the communication module 332.

When it receives sensing information from the client 100 (S120; Yes), the sensory convergence metaverse platform device 300 analyzes the sensing information (S130). The device 300 recognizes the user's behavior as synchronous or asynchronous with reference to previously stored behavior information. That is, based on the sensing information, the device 300 determines whether the user's behavior is synchronous behavior information that must be reflected in real time, or asynchronous behavior information derived from the user's lifelog (i.e., user information and log information) collected in the real world over a period of time.
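The classification and branching of steps S130-S140 can be sketched as a simple dispatch. The flag name `realtime` is an assumption; the patent distinguishes the branches by whether the behavior must be reflected in real time.

```python
def dispatch(records):
    """Split sensing records into a synchronous queue (reflected in the
    avatar immediately) and a lifelog for later asynchronous analysis."""
    sync, lifelog = [], []
    for r in records:
        (sync if r.get("realtime") else lifelog).append(r)
    return sync, lifelog

sync, lifelog = dispatch([
    {"realtime": True, "joint": "arm"},   # e.g. an in-game gesture
    {"realtime": False, "place": "gym"},  # e.g. a daily-life observation
])
```

In the platform, the synchronous queue would feed the posture estimation module 331, while the lifelog would accumulate in the user information storage 350.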

If the behavior is classified as synchronous behavior information (S140; Yes), the sensory convergence metaverse platform device 300 analyzes the user behavior (S150). That is, the device 300 estimates the user's posture based on the sensing information (i.e., the positions and orientations of the sensors) in order to apply the user's synchronous behavior.

If the behavior is classified as asynchronous behavior information (S140; No), the sensory convergence metaverse platform device 300 analyzes the lifelog (i.e., user information and log information) (S160).

The sensory convergence metaverse platform device 300 then performs unit behavior analysis and definition (S170). That is, the device 300 classifies, according to preset rules, the behaviors from the lifelog that can be used in the game, and converts the sensing information into a form from which those behaviors can be inferred (S170).

The sensory convergence metaverse platform device 300 infers the user's behavior and disposition using the converted sensing information (S180). It first infers behavior from the user's place, action, and action time, and then infers the user's disposition and personality from these.
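The two-stage inference described above can be sketched with toy rules: stage one maps (place, action, time) lifelog entries to named unit behaviors, and stage two derives a coarse disposition from the accumulated behaviors. The patent does not disclose the actual rules; every rule, label, and data value here is an invented example.

```python
LIFELOG = [
    {"place": "gym", "action": "running", "hour": 7},
    {"place": "gym", "action": "lifting", "hour": 7},
    {"place": "home", "action": "reading", "hour": 21},
]

def unit_behaviors(lifelog):
    """Stage 1: map raw lifelog entries to named unit behaviors."""
    behaviors = []
    for entry in lifelog:
        if entry["place"] == "gym":
            behaviors.append("exercise")
        elif entry["action"] == "reading":
            behaviors.append("study")
    return behaviors

def disposition(behaviors):
    """Stage 2: a single toy rule over the accumulated behaviors."""
    return "active" if behaviors.count("exercise") >= 2 else "calm"

result = disposition(unit_behaviors(LIFELOG))
```

The inferred disposition would then be converted to a script language and reflected in the avatar, per steps S190-S200.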

The sensory convergence metaverse platform device 300 converts the inferred behavior and disposition into a script language usable in the metaverse (S190), and reflects in the metaverse either the user behavior produced by the synchronous analysis or the behavior converted into the script language (S200). In this way, the device 300 applies the user's behavior to the avatar in the metaverse virtual space.

Hereinafter, a service providing method using a sensory convergence metaverse platform device according to a second embodiment of the present invention will be described in detail with reference to the accompanying drawings. FIG. 8 is a flowchart illustrating this method, and describes a social-network-based matchmaking method using the sensory convergence metaverse platform device.

First, the user connects to the sensory fusion metaverse platform device 300 using a client 100, such as a smartphone game application or a game machine, based on the sensory convergence metaverse platform (S310).

After the user connects, the sensory convergence metaverse platform device 300 accumulates and stores the user's activity in the user information storage unit 350 as a lifelog (i.e., user information and log information) (S320). The lifelog includes the user's personal preference information, social relationships with other users, and the like.

The sensory fusion metaverse platform device 300 converts accumulated lifelogs into OWL triple data (S320).

The sensory convergence metaverse platform device 300 generates a social network based on the converted OWL triple data (S330).

When the user requests matchmaking at a specific time while controlling the avatar in the virtual space of the metaverse through the client 100 (S340; YES), the sensory convergence metaverse platform device 300 loads the generated social network and searches for users present in the virtual space of the metaverse (S350).

The sensory convergence metaverse platform device 300 calculates intimacy with the retrieved users (S360).

Based on the calculated intimacy and the social relationships between the users, the sensory convergence metaverse platform device 300 selects a communication partner from among the retrieved users and recommends the partner to the client 100 (S370).
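Steps S360 and S370 can be illustrated with a toy scoring function. The patent does not disclose how intimacy is computed; the formula below (shared neighbours plus logged interaction count), the graph, and all names are assumptions.

```python
# Illustrative social graph and interaction log; the scoring formula
# is an assumption -- the patent does not disclose it.
graph = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "carol"},
    "carol": {"alice", "bob", "dave"},
    "dave": {"carol"},
}
interactions = {("alice", "carol"): 3, ("alice", "dave"): 1}

def intimacy(user, other):
    """Toy intimacy score (S360): shared neighbours in the social
    network plus the number of logged interactions."""
    shared = len(graph.get(user, set()) & graph.get(other, set()))
    return shared + interactions.get((user, other), 0)

def recommend_partner(user, candidates):
    """Recommend the candidate with the highest intimacy (S370)."""
    return max(candidates, key=lambda c: intimacy(user, c), default=None)
```

Here "alice" shares one neighbour with "carol" and has three logged interactions with her, so "carol" would be recommended over "bob" or "dave".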

As described above, the sensory convergence metaverse platform device and the service providing method using the same can be applied to various fields based on behavior information collected through sensory sensor technology and wired/wireless network technology, and can support not only synchronous behavior recognition but also asynchronous behavior.

In addition, the sensory convergence metaverse platform device and the service providing method using the same analyze the behavior of many anonymous users simultaneously through distributed inference rather than centralized inference to control the behavior of the metaverse avatar, thereby enhancing the user's sense of realism and immersion.

In addition, the sensory convergence metaverse platform device and the service providing method using the same analyze the user's disposition, behavior patterns, and lifestyle through data mining techniques, and the analysis results can be utilized as parameters for social network construction and personalized services.

In addition, the sensory convergence metaverse platform device and the service providing method using the same construct a social network for each user by analyzing the log information of users in the metaverse environment, and recommend a communication partner to the user based on the constructed social network, thereby supporting interactive social networking and promoting user-centric engagement. By supporting a next-generation user community based on social networks in the virtual space, they can also be applied to the development of context-aware technology.

While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, and that many variations and modifications may be practiced without departing from the scope of the present invention.

100: client 200: content server
300: sensory convergence metaverse platform device 310: sensing information receiver
320: user behavior storage unit 330: behavior recognition processing unit
331: attitude estimation module 332: communication module
333: support module 334: filter module
335: leveling feature extraction module 336: pattern classification module
337: wrapper module 338: model storage module
339: SVM module 340: social network processing unit
342: match processing module 344: SNS storage module
346: inference module 350: user information storage unit
360: metaverse management unit 370: content provider
400: other users

Claims (1)

A sensory convergence metaverse platform device comprising:
a sensing information receiver configured to receive, from a plurality of clients, sensing information detected from a user;
a behavior recognition processing unit configured to recognize the user's behavior by classifying it as a synchronous behavior or an asynchronous behavior based on the received sensing information; and
a metaverse management unit configured to, when the recognized user behavior is a synchronous behavior, control the movement of an avatar in the metaverse virtual space by reflecting the behavior directly, and, when the recognized user behavior is an asynchronous behavior, control the movement of the avatar in the metaverse virtual space by reflecting the accumulated asynchronous behavior.
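The synchronous/asynchronous dispatch claimed above can be sketched as follows. The class, its method names, and the batch-flush policy are illustrative assumptions; the claim only requires that synchronous behaviors drive the avatar immediately while asynchronous behaviors are accumulated before being reflected.

```python
from collections import deque

class MetaverseManager:
    """Sketch of the claimed dispatch: synchronous behaviors are
    reflected in the avatar immediately; asynchronous behaviors are
    accumulated and applied later in a batch."""

    def __init__(self):
        self.avatar_actions = []      # actions already applied to the avatar
        self.async_buffer = deque()   # accumulated asynchronous behaviors

    def handle(self, behavior, synchronous):
        if synchronous:
            self.avatar_actions.append(behavior)  # reflect immediately
        else:
            self.async_buffer.append(behavior)    # accumulate first

    def flush_async(self):
        """Reflect all accumulated asynchronous behaviors in order."""
        while self.async_buffer:
            self.avatar_actions.append(self.async_buffer.popleft())
```

For example, a real-time gesture would be handled synchronously, while a daily activity summary would sit in the buffer until the next flush.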
KR1020110135881A 2011-12-15 2011-12-15 Metaverse platform for fusing actual feeling and method for providing service using the same KR20130068593A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110135881A KR20130068593A (en) 2011-12-15 2011-12-15 Metaverse platform for fusing actual feeling and method for providing service using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110135881A KR20130068593A (en) 2011-12-15 2011-12-15 Metaverse platform for fusing actual feeling and method for providing service using the same

Publications (1)

Publication Number Publication Date
KR20130068593A true KR20130068593A (en) 2013-06-26

Family

ID=48864185

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110135881A KR20130068593A (en) 2011-12-15 2011-12-15 Metaverse platform for fusing actual feeling and method for providing service using the same

Country Status (1)

Country Link
KR (1) KR20130068593A (en)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11657443B2 (en) 2014-03-25 2023-05-23 Ebay Inc. Data mesh based environmental augmentation
US10719866B2 (en) 2014-03-25 2020-07-21 Ebay Inc. Complementary activity based on availability of functionality
US11100561B2 (en) 2014-03-25 2021-08-24 Ebay Inc. Data mesh visualization
US11120492B2 (en) 2014-03-25 2021-09-14 Ebay Inc. Device ancillary activity
US11210723B2 (en) 2014-03-25 2021-12-28 Ebay Inc. Data mesh based environmental augmentation
US12033204B2 (en) 2014-03-25 2024-07-09 Ebay Inc. Device ancillary activity
US11900437B2 (en) 2014-03-25 2024-02-13 Ebay Inc. Data mesh based environmental augmentation
KR20180049274A (en) * 2014-03-25 2018-05-10 이베이 인크. Data mesh platform
US11810178B2 (en) 2014-03-25 2023-11-07 Ebay Inc. Data mesh visualization
KR20220146298A (en) 2021-04-23 2022-11-01 이상열 Metaverse system that provides an economic operating system to a fusion space where reality and virtualization are fused
KR20220165214A (en) 2021-06-07 2022-12-14 (주) 애니펜 Method, system, and non-transitory computer-readable recording medium for providing contents
KR20230024451A (en) * 2021-08-11 2023-02-21 한국전자기술연구원 Method, apparatus, and system for providing metaverse
WO2023017890A1 (en) * 2021-08-11 2023-02-16 한국전자기술연구원 Method for providing metaverse, apparatus for providing metaverse, and system for providing metaverse
KR20230027474A (en) 2021-08-19 2023-02-28 주식회사 사운드파인트리 Apparatus for Operating Hair Shop by Using Metaverse and Driving Method Thereof
KR102428990B1 (en) * 2021-09-08 2022-08-03 장혁 User-customized content recommendation system and method
KR20230060916A (en) 2021-10-28 2023-05-08 (주) 텔로스 Realistic contents virtual reality platform server, realistic contents virtual reality system including the same, and method of operating realistic contents virtual reality platform
KR20230096695A (en) 2021-12-23 2023-06-30 (주)가시 Metaverse platform system based on sensor and location information
KR20230102673A (en) 2021-12-30 2023-07-07 박재홍 Synchronized system and method between real world and metaverse
KR20230102674A (en) 2021-12-30 2023-07-07 박재홍 Ynchronized system and method using nfc tag
KR102412142B1 (en) * 2022-01-10 2022-06-22 인트인 주식회사 Method for pattern-analyzing behavior of avatar in metaverse based on deep learning
KR20230109829A (en) 2022-01-13 2023-07-21 아이작에스엔씨 주식회사 CellCity service system
KR20230112388A (en) * 2022-01-20 2023-07-27 주식회사 유비온 Method and system for maintaining avatar consistency among diverse metaverse platforms
KR20230120834A (en) 2022-02-10 2023-08-17 (주)가시 Business conference system using the metaverse gather town
KR20230134284A (en) 2022-03-14 2023-09-21 주식회사 비디 Metaverse resource management platform device
KR20240000783A (en) 2022-06-24 2024-01-03 넥스트스토리 주식회사 System and Method for join-now game service based on metaverse
KR102615263B1 (en) * 2022-07-29 2023-12-19 (주)엣지디엑스 Metaverse system
KR20240062575A (en) * 2022-11-02 2024-05-09 메타브릿지 주식회사 System for providing Metaverse flatform bridging real economy
WO2024096501A1 (en) * 2022-11-02 2024-05-10 메타브릿지 주식회사 System for providing real economy-linked metaverse platform
WO2024128417A1 (en) * 2022-12-15 2024-06-20 주식회사 맥스트 Method for generating metaverse space and teleporting avatar in metaverse space
KR20240093031A (en) 2022-12-15 2024-06-24 주식회사 마크애니 Method for providing service differentially based on decentralized identifier

Similar Documents

Publication Publication Date Title
KR20130068593A (en) Metaverse platform for fusing actual feeling and method for providing service using the same
Xu et al. A full dive into realizing the edge-enabled metaverse: Visions, enabling technologies, and challenges
US11704939B2 (en) Liveness detection
CN105431813B (en) It is acted based on biometric identity home subscriber
US20210146255A1 (en) Emoji-based communications derived from facial features during game play
US10049287B2 (en) Computerized system and method for determining authenticity of users via facial recognition
US9965675B2 (en) Using virtual reality for behavioral analysis
EP2297649B1 (en) Providing access to virtual spaces that are associated with physical analogues in the real world
US9560094B2 (en) System and method for identifying and analyzing personal context of a user
CN102918518B (en) Individual characteristics profile data based on cloud
US20150304454A1 (en) System and method for providing virtual spaces for access by users via the web
WO2012140562A1 (en) System and method for developing evolving online profiles
EP2798517A2 (en) A method and system for creating an intelligent social network between plurality of devices
Jebbar et al. A fog-based architecture for remote phobia treatment
Ranathunga et al. Interfacing a cognitive agent platform with second life
Kim et al. Virtual world control system using sensed information and adaptation engine
Meng et al. De-anonymization attacks on metaverse
CN116383494A (en) Information resource pushing method, device and system based on live-action universe
US20220253717A1 (en) System and method for bringing inanimate characters to life
KR102259126B1 (en) Appartus and method for generating customizing image
KR102375736B1 (en) A Method and Apparatus for Artificial Intelligence Avatar Matching by 5G Communication-based Communication Pattern Analyzing
Lim Emotions, behaviour and belief regulation in an intelligent guide with attitude
Lee et al. Enabling human activity recognition with smartphone sensors in a mobile environment
JP4198643B2 (en) Status display method, mobile communication system, and server
CN117932165B (en) Personalized social method, system, electronic equipment and storage medium

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination