KR20130068593A - Metaverse platform for fusing actual feeling and method for providing service using the same - Google Patents
Metaverse platform for fusing actual feeling and method for providing service using the same
- Publication number
- KR20130068593A KR1020110135881A KR20110135881A
- Authority
- KR
- South Korea
- Prior art keywords
- behavior
- metaverse
- user
- sensory
- information
- Prior art date
Links
- 238000000034 method Methods 0.000 title abstract description 29
- 238000012545 processing Methods 0.000 claims abstract description 18
- 230000001360 synchronised effect Effects 0.000 claims abstract description 17
- 230000004927 fusion Effects 0.000 claims description 21
- 230000001186 cumulative effect Effects 0.000 claims description 2
- 238000005516 engineering process Methods 0.000 abstract description 13
- 230000006399 behavior Effects 0.000 description 99
- 230000001953 sensory effect Effects 0.000 description 73
- 238000004891 communication Methods 0.000 description 28
- 238000010586 diagram Methods 0.000 description 8
- 238000012706 support-vector machine Methods 0.000 description 8
- 230000000694 effects Effects 0.000 description 6
- 238000004422 calculation algorithm Methods 0.000 description 4
- 238000007418 data mining Methods 0.000 description 3
- 238000000605 extraction Methods 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 238000011161 development Methods 0.000 description 2
- 230000002708 enhancing effect Effects 0.000 description 2
- 230000003203 everyday effect Effects 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 230000002452 interceptive effect Effects 0.000 description 2
- 230000006855 networking Effects 0.000 description 2
- 230000000737 periodic effect Effects 0.000 description 2
- 238000004458 analytical method Methods 0.000 description 1
- 230000003542 behavioural effect Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 150000001875 compounds Chemical class 0.000 description 1
- 230000005358 geomagnetic field Effects 0.000 description 1
- 238000007654 immersion Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000003012 network analysis Methods 0.000 description 1
Images
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Tourism & Hospitality (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- Human Resources & Organizations (AREA)
- General Business, Economics & Management (AREA)
- General Health & Medical Sciences (AREA)
- Economics (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
The present invention relates to a sensory convergence metaverse platform device and a service providing method using the same. More specifically, it relates to a device and method that provide a realistic entertainment game service in which users' actions are recognized and reflected in the service, and a social network is established among multiple users.
Recently, services and companies such as Second Life, Nintendo, Google, and Microsoft have actively attempted to attract game users' interest by effectively expressing and inferring the relationships between users.
In addition, realistic console games are being released, and such games are expected to move online in the near future so that multiple users can interact in a single virtual space.
In addition, the importance of schemas for efficiently expressing social relations between game users, and of social inference for improving the usability of games, is growing, and demand is surging for mixed-reality-based, user-centered realistic entertainment services in which the real and the virtual coexist.
Platforms built on unit technologies, including prior-art Massively Multiplayer Online Game (MMOG) server technology, are difficult to apply to realistic converged entertainment services and cannot support convergent functions such as user behavior recognition and social networking.
In addition, some technologies use acceleration- and angular-velocity-based sensors to recognize the user's behavior and apply it to realistic games, but because they simply mirror the user's movement in the game, they cannot provide services that engage all five senses.
The present invention has been proposed to solve the conventional problems described above. An object of the present invention is to provide a sensory convergence metaverse platform device, and a service providing method using the same, that reflect a user's behavior data in an avatar in a metaverse engine (a virtual space) and create a social network between users.
In order to achieve the above object, a sensory convergence metaverse platform apparatus according to an embodiment of the present invention includes: a sensing information receiver configured to receive sensing information describing a user's behavior from a plurality of clients; a behavior recognition processor configured to recognize the user's behavior based on the received sensing information by classifying it into synchronous behavior and asynchronous behavior; and a metaverse manager configured, if the recognized user behavior is synchronous, to reflect it directly in the movement of the avatar in the metaverse virtual space and, if the recognized user behavior is asynchronous, to reflect the accumulated asynchronous behavior in the movement of the avatar in the metaverse virtual space.
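The synchronous/asynchronous dispatch described above can be sketched in a few lines. This is an illustrative simplification, not the claimed implementation; the names `MetaverseManager`, `handle_behavior`, and `flush_async` are hypothetical.

```python
from collections import Counter

class MetaverseManager:
    """Sketch of the dispatch logic: synchronous behaviors drive the
    avatar immediately; asynchronous behaviors are accumulated and
    reflected later in a batch."""

    def __init__(self):
        self.avatar_actions = []       # actions already applied to the avatar
        self.async_buffer = Counter()  # accumulated asynchronous behaviors

    def handle_behavior(self, behavior, synchronous):
        if synchronous:
            # e.g. a jump recognized in real time from body-worn sensors
            self.avatar_actions.append(behavior)
        else:
            # e.g. everyday "walk"/"sit" events mined from a lifelog
            self.async_buffer[behavior] += 1

    def flush_async(self):
        """Reflect the accumulated asynchronous behaviors in the avatar,
        then clear the buffer."""
        for behavior, count in self.async_buffer.items():
            self.avatar_actions.append((behavior, count))
        self.async_buffer.clear()

mgr = MetaverseManager()
mgr.handle_behavior("jump", synchronous=True)
mgr.handle_behavior("walk", synchronous=False)
mgr.handle_behavior("walk", synchronous=False)
mgr.flush_async()
```

The point of the split is latency: synchronous input must reach the avatar within one interaction frame, while asynchronous input can tolerate batching.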
The apparatus may further include a social network processor configured to generate a social network between users based on a lifelog including user information and log information of a connected client.
The apparatus may further include a social network processor configured to detect a communication partner based on the social network generated using the lifelog, which includes user information and log information of the connected client, and to provide the detected partner to the connected client.
The social network processor may detect the communication partner by calculating the intimacy between the connected client and the other clients in the metaverse virtual space.
In order to achieve the above object, a service providing method using a sensory convergence metaverse platform device according to an embodiment of the present invention includes: receiving, by a sensing information receiver, sensing information describing user behavior from a connected client; recognizing, by a behavior recognition processor, the user behavior based on the received sensing information by classifying it into synchronous behavior and asynchronous behavior; controlling, by a metaverse manager, the movement of the avatar in the metaverse virtual space by directly reflecting the recognized behavior if it is synchronous; and controlling, by the metaverse manager, the movement of the avatar in the metaverse virtual space by reflecting the accumulated asynchronous behavior if the recognized behavior is asynchronous.
The method may further include generating, by the social network processor, a social network between users based on a lifelog including user information and log information of the connected client.
The method may further include detecting, by the social network processor, a communication partner based on the social network generated using the lifelog, which includes user information and log information of the connected client, and providing the detected partner to the connected client.
In the step of detecting and providing the communication partner to the connected client, the social network processor may detect the communication partner by calculating the intimacy between the connected client and the other clients in the metaverse virtual space.
According to the present invention, the sensory convergence metaverse platform device and the service providing method using the same combine sensory sensor technology with wired/wireless network technology, can be applied to various fields based on the collected behavior information, and support not only synchronous behavior recognition but also asynchronous behavior recognition.
In addition, the device and method analyze the behavior of many anonymous users simultaneously through distributed inference rather than centralized inference to control the behavior of metaverse avatars, which enhances the user's sense of realism and immersion.
In addition, the device and method can analyze a user's disposition, behavior patterns, and lifestyle through data mining techniques, and the analyzed disposition can be used as a parameter for services such as configuring social networks of users with similar dispositions and providing personalized services.
In addition, the device and method construct a social network for each user by analyzing users' log information in the metaverse environment and recommend communication partners based on the constructed network, thereby supporting interactive social networking and promoting user-centered engagement. By supporting a next-generation user community based on social networks in the virtual space, they can also contribute to the development of context-aware technology.
1 is a block diagram illustrating a sensory fusion metaverse platform device according to an embodiment of the present invention.
2 is a view for explaining an application example of the sensory fusion metaverse platform device according to an embodiment of the present invention.
3 is a block diagram illustrating a configuration of a sensory fusion metaverse platform device according to an embodiment of the present invention.
4 and 5 are block diagrams illustrating the behavior recognition processor of FIG. 3.
6 is a block diagram illustrating a social network processing unit of FIG. 3.
7 is a flowchart illustrating a service providing method using a sensory convergence metaverse platform device according to a first embodiment of the present invention.
8 is a flowchart illustrating a service providing method using a sensory convergence metaverse platform device according to a second embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings so that a person skilled in the art can easily carry out the technical idea of the present invention. In the drawings, the same reference numerals designate the same or similar components throughout. In the following description, detailed descriptions of known functions and configurations are omitted where they would obscure the subject matter of the present invention.
First, before describing the embodiments of the present invention, its features are outlined. "Metaverse" is a compound of "meta," meaning transcendence or abstraction, and "universe," meaning the real world; it refers to a virtual world, such as the web and the internet, absorbed into the real world. The present invention operates in a metaverse engine environment that projects the behavior of a user in the real world onto an avatar.
In the present invention, the user's actions are classified into asynchronous and synchronous actions.
For asynchronous behavior, sensing data is collected from the user's everyday life, and the collected data is analyzed to infer the user's daily activities. The inference results from asynchronous behavior recognition are used to analyze the user's disposition, behavior patterns, and lifestyle.
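As an illustrative sketch of this idea (not the patent's data-mining method), inferring a behavior pattern from an accumulated lifelog might reduce to aggregating recognized everyday behaviors; the lifelog fields and the majority-count rule below are assumptions.

```python
from collections import Counter

# Hypothetical lifelog: (date, recognized everyday behavior) pairs
lifelog = [
    ("2011-12-01", "walking"), ("2011-12-01", "sitting"),
    ("2011-12-02", "walking"), ("2011-12-02", "walking"),
    ("2011-12-03", "running"), ("2011-12-03", "walking"),
]

def dominant_pattern(entries):
    """Infer the user's dominant behavior pattern as the most frequent
    recognized behavior in the lifelog, with its occurrence count."""
    counts = Counter(behavior for _, behavior in entries)
    return counts.most_common(1)[0]

pattern, freq = dominant_pattern(lifelog)
# pattern == "walking", freq == 4
```

A real system would mine richer features (time of day, location, sequences) rather than a single majority label, but the accumulate-then-infer shape is the same.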
For synchronous behavior, in order to overcome the limitations of games that rely on existing simple input devices, behavior recognition technology is applied to recognize the user's behavior synchronously, and the user's behavior information is reflected in the game to improve its reality and immersion. Multi-sensors are attached to the user's arms, legs, waist, and so on, and the information collectable from each sensor (3-axis acceleration, 3-axis angular velocity, 3-axis geomagnetic field, etc.) is integrated to estimate posture and recognize behavior.
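A minimal sketch of synchronous recognition from a single body-worn 3-axis accelerometer follows. The described pipeline (filtering, feature extraction, SVM pattern classification over multiple fused sensors) is considerably richer; the variance threshold and the two labels here are illustrative assumptions only.

```python
import math

def magnitude(sample):
    """Euclidean norm of one 3-axis accelerometer sample (ax, ay, az)."""
    return math.sqrt(sum(v * v for v in sample))

def classify_window(window, threshold=0.5):
    """Label a window of samples 'moving' if the acceleration magnitude
    varies strongly around its mean, else 'static'."""
    mags = [magnitude(s) for s in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return "moving" if var > threshold else "static"

still = [(0.0, 0.0, 9.8)] * 10                   # resting posture: gravity only
punch = [(0.0, 0.0, 9.8), (5.0, 1.0, 12.0)] * 5  # vigorous gesture
```

In a full pipeline this windowed feature would be one input among many to a trained classifier (e.g. an SVM), with gyroscope and geomagnetic channels fused in for posture estimation.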
In the present invention, the social network middleware constructs the user's social network based on the user's log information stored in the metaverse server and recommends communication partners based on that network.
To build the social network, OWL (Web Ontology Language) is used to express user information in the metaverse and to model the social relationships between users.
To match appropriate communication partners based on the social network, the intimacy between the nodes constituting the network is calculated, and rules for recommending partners are defined.
The social network middleware automatically builds metaverse user information into a social network, receives the users' log information from the metaverse, and delivers the partner information produced by the recommendation algorithm back to the metaverse.
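The intimacy calculation and partner recommendation can be sketched as follows, assuming intimacy is taken as a neighbor's share of the user's logged interactions. The patent defines its own recommendation rules, so this weighting and the log format are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical interaction log mined from metaverse user logs:
# (user, other user, number of chats/encounters)
interactions = [
    ("alice", "bob", 12),
    ("alice", "carol", 3),
    ("alice", "dave", 7),
]

def intimacy_scores(log, user):
    """Map each neighbor of `user` to an intimacy score in [0, 1]:
    the neighbor's share of the user's total interactions."""
    counts = defaultdict(int)
    for u, v, n in log:
        if u == user:
            counts[v] += n
    total = sum(counts.values())
    return {v: n / total for v, n in counts.items()}

def recommend_partner(log, user):
    """Recommend the neighbor with the highest intimacy score."""
    scores = intimacy_scores(log, user)
    return max(scores, key=scores.get)
```

With the sample log, `alice` has total interaction count 22, so `bob` (12/22) is recommended over `dave` (7/22) and `carol` (3/22).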
Hereinafter, the sensory fusion metaverse platform device according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. 1 is a block diagram illustrating a sensory fusion metaverse platform device according to an embodiment of the present invention. 2 is a view for explaining an application example of the sensory fusion metaverse platform device according to an embodiment of the present invention. 3 is a block diagram illustrating a configuration of a sensory fusion metaverse platform device according to an embodiment of the present invention. 4 and 5 are block diagrams illustrating the behavior recognition processor of FIG. 3, and FIG. 6 is a block diagram illustrating the social network processor of FIG. 3.
As shown in FIG. 1, the sensory convergence
The
The
The sensory convergence
The sensory convergence
The sensory convergence
To this end, as shown in FIG. 3, the sensory convergence
The
The
The
To this end, as shown in FIG. 4, the
At this time, as shown in FIG. 5, the
The
The
The user
The metaverse manager 360 manages a metaverse, which is a virtual space in which an avatar on which the real world is projected acts. That is, the metaverse manager 360 applies the behavior recognition information of the
The
Hereinafter, a service providing method using a sensory convergence metaverse platform device according to a first embodiment of the present invention will be described in detail with reference to the accompanying drawings. 7 is a flowchart illustrating a service providing method using a sensory convergence metaverse platform device according to a first embodiment of the present invention. FIG. 7 illustrates a method of applying a user's behavior to an avatar in a metaverse virtual space using a sensory convergence metaverse platform device.
First, a user connects to the sensory fusion
When receiving the sensing information from the client 100 (S120; YES), the sensory convergence
If it is divided into the synchronous behavior information (S140; Yes), the sensory convergence
When the sensory convergence
The sensory convergence
The sensory convergence
The sensory convergence
Hereinafter, a service providing method using a sensory convergence metaverse platform device according to a second embodiment of the present invention will be described in detail with reference to the accompanying drawings. 8 is a flowchart illustrating a service providing method using a sensory convergence metaverse platform device according to the second embodiment of the present invention. FIG. 8 illustrates a social-network-based match-making method using the sensory convergence metaverse platform device.
First, the user connects to the sensory fusion
The sensory convergence
The sensory fusion
The sensory convergence
When the user requests a match making at a specific time while controlling the avatar in the virtual space of the metaverse through the client 100 (S340; YES), the sensory convergence
The sensory fusion
The sensory fusion
While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments; on the contrary, many variations and modifications may be made without departing from the scope of the present invention.
100: Client 200: Content Server
300: sensory convergence metaverse platform device 310: sensing information receiver
320: user behavior storage unit 330: behavior recognition processing unit
331: attitude estimation module 332: communication module
333: support module 334: filter module
335: leveling feature extraction module 336: pattern classification module
337: wrapper module 338: model storage module
339: SVM module 340: social network processing unit
342: Match processing module 344: SNS storage module
346: inference module 350: user information storage unit
360: metaverse management unit 370: content provider
400: other users
Claims (1)
A sensory convergence metaverse platform device comprising: a behavior recognition processor configured to recognize a user's behavior based on received sensing information by classifying it into synchronous behavior and asynchronous behavior; and a metaverse manager configured, if the recognized user behavior is synchronous, to reflect it directly in the movement of the avatar in the metaverse virtual space and, if the recognized user behavior is asynchronous, to reflect the accumulated asynchronous behavior in the movement of the avatar in the metaverse virtual space.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020110135881A KR20130068593A (en) | 2011-12-15 | 2011-12-15 | Metaverse platform for fusing actual feeling and method for providing service using the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020110135881A KR20130068593A (en) | 2011-12-15 | 2011-12-15 | Metaverse platform for fusing actual feeling and method for providing service using the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20130068593A true KR20130068593A (en) | 2013-06-26 |
Family
ID=48864185
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020110135881A KR20130068593A (en) | 2011-12-15 | 2011-12-15 | Metaverse platform for fusing actual feeling and method for providing service using the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20130068593A (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180049274A (en) * | 2014-03-25 | 2018-05-10 | 이베이 인크. | Data mesh platform |
KR102412142B1 (en) * | 2022-01-10 | 2022-06-22 | 인트인 주식회사 | Method for pattern-analyzing behavior of avatar in metaverse based on deep learning |
KR102428990B1 (en) * | 2021-09-08 | 2022-08-03 | 장혁 | User-customized content recommendation system and method |
KR20220146298A (en) | 2021-04-23 | 2022-11-01 | 이상열 | Metaverse system that provides an economic operating system to a fusion space where reality and virtualization are fused |
KR20220165214A (en) | 2021-06-07 | 2022-12-14 | (주) 애니펜 | Method, system, and non-transitory computer-readable recording medium for providing contents |
WO2023017890A1 (en) * | 2021-08-11 | 2023-02-16 | 한국전자기술연구원 | Method for providing metaverse, apparatus for providing metaverse, and system for providing metaverse |
KR20230027474A (en) | 2021-08-19 | 2023-02-28 | 주식회사 사운드파인트리 | Apparatus for Operating Hair Shop by Using Metaverse and Driving Method Thereof |
KR20230060916A (en) | 2021-10-28 | 2023-05-08 | (주) 텔로스 | Realistic contents virtual reality platform server, realistic contents virtual reality system including the same, and method of operating realistic contents virtual reality platform |
KR20230096695A (en) | 2021-12-23 | 2023-06-30 | (주)가시 | Metaverse platform system based on sensor and location information |
KR20230102674A (en) | 2021-12-30 | 2023-07-07 | 박재홍 | Ynchronized system and method using nfc tag |
KR20230102673A (en) | 2021-12-30 | 2023-07-07 | 박재홍 | Synchronized system and method between real world and metaverse |
KR20230109829A (en) | 2022-01-13 | 2023-07-21 | 아이작에스엔씨 주식회사 | CellCity service system |
KR20230112388A (en) * | 2022-01-20 | 2023-07-27 | 주식회사 유비온 | Method and system for maintaining avatar consistency among diverse metaverse platforms |
KR20230120834A (en) | 2022-02-10 | 2023-08-17 | (주)가시 | Business conference system using the metaverse gather town |
KR20230134284A (en) | 2022-03-14 | 2023-09-21 | 주식회사 비디 | Metaverse resource management platform device |
KR102615263B1 (en) * | 2022-07-29 | 2023-12-19 | (주)엣지디엑스 | Metaverse system |
KR20240000783A (en) | 2022-06-24 | 2024-01-03 | 넥스트스토리 주식회사 | System and Method for join-now game service based on metaverse |
KR20240062575A (en) * | 2022-11-02 | 2024-05-09 | 메타브릿지 주식회사 | System for providing Metaverse flatform bridging real economy |
WO2024128417A1 (en) * | 2022-12-15 | 2024-06-20 | 주식회사 맥스트 | Method for generating metaverse space and teleporting avatar in metaverse space |
KR20240093031A (en) | 2022-12-15 | 2024-06-24 | 주식회사 마크애니 | Method for providing service differentially based on decentralized identifier |
-
2011
- 2011-12-15 KR KR1020110135881A patent/KR20130068593A/en not_active Application Discontinuation
Cited By (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11657443B2 (en) | 2014-03-25 | 2023-05-23 | Ebay Inc. | Data mesh based environmental augmentation |
US10719866B2 (en) | 2014-03-25 | 2020-07-21 | Ebay Inc. | Complementary activity based on availability of functionality |
US11100561B2 (en) | 2014-03-25 | 2021-08-24 | Ebay Inc. | Data mesh visualization |
US11120492B2 (en) | 2014-03-25 | 2021-09-14 | Ebay Inc. | Device ancillary activity |
US11210723B2 (en) | 2014-03-25 | 2021-12-28 | Ebay Inc. | Data mesh based environmental augmentation |
US12033204B2 (en) | 2014-03-25 | 2024-07-09 | Ebay Inc. | Device ancillary activity |
US11900437B2 (en) | 2014-03-25 | 2024-02-13 | Ebay Inc. | Data mesh based environmental augmentation |
KR20180049274A (en) * | 2014-03-25 | 2018-05-10 | 이베이 인크. | Data mesh platform |
US11810178B2 (en) | 2014-03-25 | 2023-11-07 | Ebay Inc. | Data mesh visualization |
KR20220146298A (en) | 2021-04-23 | 2022-11-01 | 이상열 | Metaverse system that provides an economic operating system to a fusion space where reality and virtualization are fused |
KR20220165214A (en) | 2021-06-07 | 2022-12-14 | (주) 애니펜 | Method, system, and non-transitory computer-readable recording medium for providing contents |
KR20230024451A (en) * | 2021-08-11 | 2023-02-21 | 한국전자기술연구원 | Method, apparatus, and system for providing metaverse |
WO2023017890A1 (en) * | 2021-08-11 | 2023-02-16 | 한국전자기술연구원 | Method for providing metaverse, apparatus for providing metaverse, and system for providing metaverse |
KR20230027474A (en) | 2021-08-19 | 2023-02-28 | 주식회사 사운드파인트리 | Apparatus for Operating Hair Shop by Using Metaverse and Driving Method Thereof |
KR102428990B1 (en) * | 2021-09-08 | 2022-08-03 | 장혁 | User-customized content recommendation system and method |
KR20230060916A (en) | 2021-10-28 | 2023-05-08 | (주) 텔로스 | Realistic contents virtual reality platform server, realistic contents virtual reality system including the same, and method of operating realistic contents virtual reality platform |
KR20230096695A (en) | 2021-12-23 | 2023-06-30 | (주)가시 | Metaverse platform system based on sensor and location information |
KR20230102673A (en) | 2021-12-30 | 2023-07-07 | 박재홍 | Synchronized system and method between real world and metaverse |
KR20230102674A (en) | 2021-12-30 | 2023-07-07 | 박재홍 | Ynchronized system and method using nfc tag |
KR102412142B1 (en) * | 2022-01-10 | 2022-06-22 | 인트인 주식회사 | Method for pattern-analyzing behavior of avatar in metaverse based on deep learning |
KR20230109829A (en) | 2022-01-13 | 2023-07-21 | 아이작에스엔씨 주식회사 | CellCity service system |
KR20230112388A (en) * | 2022-01-20 | 2023-07-27 | 주식회사 유비온 | Method and system for maintaining avatar consistency among diverse metaverse platforms |
KR20230120834A (en) | 2022-02-10 | 2023-08-17 | (주)가시 | Business conference system using the metaverse gather town |
KR20230134284A (en) | 2022-03-14 | 2023-09-21 | 주식회사 비디 | Metaverse resource management platform device |
KR20240000783A (en) | 2022-06-24 | 2024-01-03 | 넥스트스토리 주식회사 | System and Method for join-now game service based on metaverse |
KR102615263B1 (en) * | 2022-07-29 | 2023-12-19 | (주)엣지디엑스 | Metaverse system |
KR20240062575A (en) * | 2022-11-02 | 2024-05-09 | 메타브릿지 주식회사 | System for providing Metaverse flatform bridging real economy |
WO2024096501A1 (en) * | 2022-11-02 | 2024-05-10 | 메타브릿지 주식회사 | System for providing real economy-linked metaverse platform |
WO2024128417A1 (en) * | 2022-12-15 | 2024-06-20 | 주식회사 맥스트 | Method for generating metaverse space and teleporting avatar in metaverse space |
KR20240093031A (en) | 2022-12-15 | 2024-06-24 | 주식회사 마크애니 | Method for providing service differentially based on decentralized identifier |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR20130068593A (en) | Metaverse platform for fusing actual feeling and method for providing service using the same | |
Xu et al. | A full dive into realizing the edge-enabled metaverse: Visions, enabling technologies, and challenges | |
US11704939B2 (en) | Liveness detection | |
CN105431813B (en) | It is acted based on biometric identity home subscriber | |
US20210146255A1 (en) | Emoji-based communications derived from facial features during game play | |
US10049287B2 (en) | Computerized system and method for determining authenticity of users via facial recognition | |
US9965675B2 (en) | Using virtual reality for behavioral analysis | |
EP2297649B1 (en) | Providing access to virtual spaces that are associated with physical analogues in the real world | |
US9560094B2 (en) | System and method for identifying and analyzing personal context of a user | |
CN102918518B (en) | Individual characteristics profile data based on cloud | |
US20150304454A1 (en) | System and method for providing virtual spaces for access by users via the web | |
WO2012140562A1 (en) | System and method for developing evolving online profiles | |
EP2798517A2 (en) | A method and system for creating an intelligent social network between plurality of devices | |
Jebbar et al. | A fog-based architecture for remote phobia treatment | |
Ranathunga et al. | Interfacing a cognitive agent platform with second life | |
Kim et al. | Virtual world control system using sensed information and adaptation engine | |
Meng et al. | De-anonymization attacks on metaverse | |
CN116383494A (en) | Information resource pushing method, device and system based on live-action universe | |
US20220253717A1 (en) | System and method for bringing inanimate characters to life | |
KR102259126B1 (en) | Appartus and method for generating customizing image | |
KR102375736B1 (en) | A Method and Apparatus for Artificial Intelligence Avatar Matching by 5G Communication-based Communication Pattern Analyzing | |
Lim | Emotions, behaviour and belief regulation in an intelligent guide with attitude | |
Lee et al. | Enabling human activity recognition with smartphone sensors in a mobile environment | |
JP4198643B2 (en) | Status display method, mobile communication system, and server | |
CN117932165B (en) | Personalized social method, system, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |