KR101112190B1 - Method for information service based real life in a cyber-space - Google Patents


Info

Publication number
KR101112190B1
KR101112190B1, KR1020090057142A, KR20090057142A
Authority
KR
South Korea
Prior art keywords
information
space
virtual
real
virtual world
Prior art date
Application number
KR1020090057142A
Other languages
Korean (ko)
Other versions
KR20100138556A (en)
Inventor
권용진 (Kwon Yong-jin)
Original Assignee
한국항공대학교산학협력단 (Korea Aerospace University Industry-Academic Cooperation Foundation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국항공대학교산학협력단 (Korea Aerospace University Industry-Academic Cooperation Foundation)
Priority to KR1020090057142A priority Critical patent/KR101112190B1/en
Priority to PCT/KR2010/003649 priority patent/WO2010150991A2/en
Publication of KR20100138556A publication Critical patent/KR20100138556A/en
Application granted granted Critical
Publication of KR101112190B1 publication Critical patent/KR101112190B1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/004 Artificial life, i.e. computing arrangements simulating life
    • G06N 3/006 Artificial life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/95 Retrieval from the web
    • G06F 16/954 Navigation, e.g. using categorised browsing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Health & Medical Sciences (AREA)
  • Processing Or Creating Images (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Disclosed is a method for providing information based on real life in a virtual space, in which information organized along a time axis (hereinafter abbreviated as "time-axis information"), which cannot be stored in reality, is stored and provided based on location information in the virtual space.

The disclosed method for providing information based on real life in a virtual space includes: mapping a real space to a virtual world; combining time-axis information with the user's location in the virtual world and recording the user's movement; and conversing in real time with unspecified other users in the virtual world.

Time-axis information storage, virtual world, virtual space

Description

Method for information service based real life in a cyber-space

The present invention relates to a method for providing information based on real life in a virtual space, and more particularly, to a method that stores information organized along a time axis (hereinafter abbreviated as "time-axis information"), which cannot be stored in reality, based on location information in a virtual space, so that the information can later be retrieved and provided.

In general, it is almost impossible to record every situation an individual experiences in real life. In particular, storing time-axis information is impossible in reality.

Until now, individuals have recorded their real lives only as needed, by capturing video with a recording device such as a camcorder, or by writing notes by hand or saving them as files on a computer.

Recording all of real life this way is difficult, and in practice only a few somewhat special moments are recorded.

People often look back on their lives and recall memories as time passes, but because they have no historical record of their real life, they are limited to what they can remember.

Meanwhile, whether for travel or for actually installing a symbol or similar structure in real space, a travel destination can currently be evaluated only by examining it directly, and a symbol likewise can be evaluated only once it has actually been installed in real space.

Therefore, it is necessary to build an environment in virtual space that matches real life, and such research is currently being conducted in many places.

Accordingly, the present invention is proposed to overcome the disadvantages described above, which stem from the inability to store time-axis information about real life.

One object of the present invention is to provide a method for providing information based on real life in a virtual space, whereby time-axis information, which cannot be stored in reality, is stored based on location information in the virtual space.

Another object of the present invention is to provide such a method in which real space is naturally simulated in the virtual world, blurring the boundary between the real and the virtual so that information flows naturally between real space and virtual space.

A further object of the present invention is to provide such a method offering a human-centered information service based on the user's lifelog and location information, using information from the virtual world.

Here, a lifelog refers to technology that records and searches all the information experienced in daily life, using digital media such as video, voice, and pictures. It makes it possible to reconstruct what you did, whom you met, what you ate, and other events and experiences from video, voice, and photos.

In other words, a lifelog denotes the technologies and services that automatically store data on every device you own, including mobile phones, digital cameras, watches, and MP3 players, as well as nearby TVs, computers, navigation systems, and sensors (RFID), and then gather that data in one place where it can be reviewed and processed.

According to the present invention, the "method for providing information based on real life in a virtual space" that solves the above problems includes mapping a digital map (real space) to the virtual world by constructing, through the U-MAP integrated platform, a 3D virtual space identical to the digital map, using a DXF (Drawing eXchange Format) parser and a terrain object converter.


The method further includes combining time-axis information with the user's location obtained in the real world and recording the user's movement in the virtual world,

and is characterized by combining real space with the virtual world using time-axis information based on location information, direction information, and a personal profile.

The location information is obtained using GPS, and the direction information is obtained using an azimuth sensor.

In addition, recording the movement is characterized by embodying a personal character: the direction information is recognized and, in conjunction with the character, the individual's actual movement is reflected in the character.

In addition, the step of conversing in real time with unspecified users in the virtual world performs voice conversation that can be integrated with OpenSim; the client connects to a graphics server and a voice server simultaneously, and the conversation is implemented by delivering the voice alongside the graphics.

According to the present invention, time-axis information reflecting an individual's real life can be stored.

In addition, instead of exploring a travel destination directly, an individual can survey it in the virtual space beforehand and evaluate it; and after installing a symbol in the virtual space, the suitability of actually installing that symbol in real space can be evaluated in advance.

In addition, a human-centered information service can be provided based on the user's lifelog and location information, using information from the virtual world.

Hereinafter, a preferred embodiment of the present invention is described in detail with reference to the accompanying drawings. In the following description, well-known functions and constructions are not described in detail, since they would obscure the invention with unnecessary detail.

FIGS. 1 and 2 are schematic configuration diagrams of a system to which the method for providing information based on real life in a virtual space according to the present invention is applied. The system comprises a user recognition sensor 110, an integrated platform 120, an ambient browser 130, a digital map database (DB) 140, an OpenSim DB 150, a personal profile DB 160, a time-axis information DB 170, a graphics server 210, and a voice server 220.

FIG. 3 is a flowchart of the "method for providing information based on real life in a virtual space" according to the present invention, comprising: mapping a real space to a virtual world (S100); combining time-axis information with the user's position obtained in the real world and recording movement in the virtual world (S200); and conversing in real time with unspecified users in the virtual world (S300).

The "method for providing information based on real life in a virtual space" according to the present invention, configured as described above, is now described in detail.

The user recognition sensor 110 checks the user's current position and direction in real life: the position is easily determined using a positioning technology such as GPS, and the direction using an azimuth sensor.
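As a concrete illustration, a single position-and-direction reading from such a sensor pairing might be represented as below. This is a hedged sketch: the field names and the azimuth-normalization step are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of one sample from the user recognition sensor (110):
# GPS position plus azimuth-sensor direction. Field names are assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class UserSample:
    timestamp: datetime   # when the sample was taken
    latitude: float       # GPS latitude in degrees
    longitude: float      # GPS longitude in degrees
    azimuth_deg: float    # heading from the azimuth sensor; 0 = north

def make_sample(lat: float, lon: float, azimuth: float) -> UserSample:
    """Normalize the azimuth into [0, 360) and timestamp the reading."""
    return UserSample(datetime.now(timezone.utc), lat, lon, azimuth % 360.0)

sample = make_sample(37.6, 126.87, 451.0)
print(sample.azimuth_deg)  # 91.0
```

A stream of such samples is what the integrated platform would consume when recording movement.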

When the user's location and direction information identified in the real world are provided to the integrated platform 120, the integrated platform 120 creates a virtual world reflecting reality, as a preliminary step for storing time-axis information. Here, the virtual world is built with a DXF (Drawing eXchange Format) parser and a terrain object converter, currently using SecondLife and OpenSim, through the integrated platform U-MAP (Map Access Platform) 120 (S100).

Here, the DXF (Drawing eXchange Format) parser and terrain object converter are programs that analyze a digital map in the DXF format of the National Geographic Information Institute and convert it into the data format of OpenSim, an open Web 3D server.

The terrain object converter is needed because it converts the digital map into OpenSim data automatically, which automates the construction of the virtual space that previously had to be done by hand. Since the present invention uses the digital map directly, there is no discrepancy between objects in real space and in virtual space.

When mapping the real space (digital map) to the virtual space 1:1, a SecondLife client and an OpenSim server are used.

DXF (Drawing eXchange Format) is a file exchange format between AutoDesk's AutoCAD and many other CAD programs; in its standard ASCII form, spatial objects are represented as consecutive groups of two lines each.
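To make the two-line group structure concrete, here is a minimal sketch of reading those groups from ASCII DXF text. It is an illustrative fragment, not the patent's actual parser: a real converter must additionally track sections, tables, and per-entity state.

```python
def read_dxf_groups(text):
    """Yield (group_code, value) pairs from ASCII DXF text.

    DXF stores data as consecutive two-line groups: an integer group
    code followed by its value. Simplified sketch; blank or malformed
    lines are not handled.
    """
    lines = [ln.strip() for ln in text.splitlines()]
    for i in range(0, len(lines) - 1, 2):
        yield int(lines[i]), lines[i + 1]

# A tiny fragment: the start of an ENTITIES section holding a LINE entity.
fragment = "0\nSECTION\n2\nENTITIES\n0\nLINE\n"
print(list(read_dxf_groups(fragment)))
# [(0, 'SECTION'), (2, 'ENTITIES'), (0, 'LINE')]
```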

Looking at the DXF structure from the perspective of the digital map: the HEADER section holds the extent the map represents in TM coordinates; the CLASSES section is not meaningful here; the TABLES section defines styles and the like; the BLOCKS section holds blocks used in the drawing; the ENTITIES section contains the actual shape information of the drawing; and the OBJECTS section is likewise not meaningful here.

All the shape information of the digital map is contained in the ENTITIES section; FIG. 5 shows the structure of the digital map in DXF.

The data of the digital map DB 140 and the OpenSim DB 150 are as follows.

As shown in FIG. 6, all digital map data consists of vertices, and the map is completed by drawing lines between the points.

In contrast, OpenSim DB data includes, as basic information, a center point together with x-, y-, and z-axis sizes and an angle, as shown in FIG. 7. Beyond this basic information there is additional information such as texture, digging, and bending.

Therefore, to convert digital map (real space) data into OpenSim DB (virtual world) data, as shown in FIG. 8, the planar data consisting of points must be converted into solid data having a center point, x-, y-, and z-axis sizes, and an angle.

Referring to FIG. 8 as an example, the line drawn between two points becomes an object: the midpoint of the two points is found and made the center point of the object, and the length and angle between the two points are obtained and made the object's x-axis size and z-axis angle, respectively.

Next, x-, y-, and z-axis sizes are assigned to turn the planar data into solid data. For this purpose, the x- and y-axis sizes (thickness) of all objects are set to 0.2 m, the z-axis size to 10 m for building data, and the z-axis size to 0.2 m for road data such as road markings. An example is shown in FIG. 9.
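The two-point-to-object conversion described in the preceding paragraphs can be sketched as follows. The function name and dictionary layout are assumptions for illustration; the midpoint/length/angle rule and the 0.2 m and 10 m sizes come from the text.

```python
import math

# Hypothetical sketch of the line-to-object conversion: the midpoint of
# the two vertices becomes the object's center, the distance between them
# its x-axis size, and the angle between them its z-axis rotation.
# Thickness (0.2 m) and height (10 m for buildings, 0.2 m for road
# markings) follow the values given in the description.
def line_to_object(p1, p2, is_building):
    (x1, y1), (x2, y2) = p1, p2
    center = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    length = math.hypot(x2 - x1, y2 - y1)               # x-axis size
    angle = math.degrees(math.atan2(y2 - y1, x2 - x1))  # z-axis angle
    return {
        "center": center,
        "size": (length, 0.2, 10.0 if is_building else 0.2),
        "z_angle_deg": angle,
    }

obj = line_to_object((0.0, 0.0), (3.0, 4.0), is_building=True)
print(obj["size"])  # (5.0, 0.2, 10.0)
```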

In addition, when converting the digital map data into OpenSim DB data, layers with line properties are distinguished from layers with plane properties; the point/line/plane property of each layer is specified in the digital map layer table. For a layer with line properties, each line drawn between two points is treated as a single object, as shown in FIG. 10. For a layer with plane properties, a closing line must also be drawn between the first and last points and treated as an object, as shown in FIG. 11.
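The line-layer versus plane-layer distinction can be sketched as below. The helper name is an illustrative assumption; `is_area_layer` stands for the plane property read from the layer table.

```python
def layer_to_objects(points, is_area_layer):
    """Turn a vertex list into the segments that become objects.

    Line layers keep each drawn segment as one object; area (plane)
    layers additionally close the shape by joining the last point back
    to the first, as described in the text.
    """
    segments = list(zip(points, points[1:]))
    if is_area_layer and len(points) > 2:
        segments.append((points[-1], points[0]))
    return segments

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(len(layer_to_objects(square, is_area_layer=False)))  # 3
print(len(layer_to_objects(square, is_area_layer=True)))   # 4
```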

In addition, an angle-calculation problem arises when converting the digital map data into OpenSim DB data. Euler angles, which assign 360° to each of the x, y, and z axes, are not used. Instead the angle is expressed with a quaternion, an extension of the complex numbers that adds a w value to x, y, and z and is better suited than Euler angles for applying 3D rotations to objects. The EulerToQuaternion class converts the Euler angle obtained between two points into a quaternion; the angle-calculation algorithm is shown in FIG. 12.
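A sketch of what an EulerToQuaternion routine typically does is shown below. This is the standard roll/pitch/yaw-to-quaternion formula, not a reproduction of the patent's class or the algorithm of FIG. 12.

```python
import math

def euler_to_quaternion(roll, pitch, yaw):
    """Convert Euler angles (radians, about x/y/z) to a quaternion (x, y, z, w).

    OpenSim objects take their 3D rotation as a quaternion rather than
    Euler angles, which avoids gimbal problems when composing rotations.
    """
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    w = cr * cp * cy + sr * sp * sy
    return (x, y, z, w)

# A 90-degree rotation about the z axis, as for a rotated map object:
x, y, z, w = euler_to_quaternion(0.0, 0.0, math.pi / 2)
print(round(z, 4), round(w, 4))  # 0.7071 0.7071
```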

The result of converting the digital map into the 3D solid space of the virtual world through this process is shown in FIG. 13.

Next, techniques are required for layering information obtained from reality onto the virtual world formed as described above, and for recording movement (S200). That is, time-axis information is stored in the virtual world.

FIGS. 14 to 16 illustrate the mapping concepts for storing, in the virtual world, time-axis information that cannot be stored in real space. FIG. 14 maps a building's time-axis information into the virtual space; FIG. 15 maps time-axis information, including the path the user traveled, into the virtual world; and FIG. 16 is a conceptual diagram of providing a personalized information service using location-based time-axis information and a personal profile. In the example, the browser 130 displays a record of the user's movement from the previous day, with related information shown along the route.

When implementing these techniques for accumulating information obtained from reality and for recording movement, the database tables must be configured, the recording of position and direction must be defined, and the recording of other user-defined information must be defined. For example, a data storage question arises when the user takes an action, such as exchanging a business card with another person or taking a picture in the real world. To solve this, the corresponding records are defined in advance.
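A minimal sketch of such a table configuration, using SQLite for illustration, is shown below. The table and column names are assumptions, since the patent does not specify a schema; the `event` column stands in for the predefined user-action records (business card exchange, photo) mentioned above.

```python
import sqlite3

# Hypothetical sketch of a table for the time-axis information DB (170):
# each row ties a timestamped position/direction sample to a user, with
# an optional event field for predefined user-action records.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE time_axis_info (
        id      INTEGER PRIMARY KEY,
        user_id TEXT NOT NULL,
        ts      TEXT NOT NULL,        -- ISO-8601 timestamp
        x REAL, y REAL, z REAL,       -- position in the virtual world
        azimuth REAL,                 -- direction in degrees
        event   TEXT                  -- predefined user-action record
    )
""")
conn.execute(
    "INSERT INTO time_axis_info (user_id, ts, x, y, z, azimuth, event) "
    "VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("u1", "2009-06-25T10:00:00", 128.0, 64.0, 21.0, 90.0, "photo"),
)
row = conn.execute("SELECT user_id, event FROM time_axis_info").fetchone()
print(row)  # ('u1', 'photo')
```

Querying such a table by user and time range is what would let the browser replay "where the user went yesterday" along with the related information.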

Next, real-time conversation with unspecified users in the virtual space must be implemented (S300). Users can meet and talk with others while moving through the virtual world that maps the real space, talk directly with an owner while browsing a store, or talk with friends while exploring a destination in advance. To implement this, a voice conversation technology that can be integrated with OpenSim is needed; an example voice chat implementation is illustrated in FIG. 17. By connecting the client to the graphics server 210 and the voice server 220 simultaneously and managing the data at each server, voice chat can be implemented easily.

Through this process, the present invention integrates time-axis information into the virtual space by reflecting a real person's movement in a character, based on the user's location and direction information acquired in real space, within the 3D virtual world.

The present invention is not limited to the specific preferred embodiments described above; various modifications can be made by anyone of ordinary skill in the art without departing from the gist of the invention claimed in the claims, and such modifications fall within the scope of the claims.

FIG. 1 is a configuration diagram of a system to which the method for providing information based on real life in a virtual space according to the present invention is applied.

FIG. 2 is a configuration diagram of the system of FIG. 1 with a graphics server and a voice server added.

FIG. 3 is a flowchart illustrating the method for providing information based on real life in a virtual space according to the present invention.

FIG. 4 is a conceptual diagram of mapping the real space to the virtual space in the present invention.

FIG. 5 is a structural diagram of a digital map in DXF.

FIG. 6 shows the data structure of a digital map.

FIG. 7 is a data structure diagram of the OpenSim DB.

FIGS. 8 to 11 are process diagrams for converting digital map data into OpenSim DB data.

FIG. 12 shows an algorithm for expressing angles using quaternions when converting digital map data into OpenSim DB data.

FIG. 13 shows the result of converting the real space into 3D solid data of the virtual world.

FIGS. 14 to 16 are conceptual diagrams of mapping time-axis information that cannot be stored in real space into the virtual world.

FIG. 17 is an example implementation of voice chat.

Claims (7)

1. (deleted)

2. A method for providing information based on real life in a virtual space, comprising: mapping a digital map (real space) to the virtual world by constructing, through the U-MAP using a DXF (Drawing eXchange Format) parser and a terrain object converter, a 3D virtual space identical to the digital map; and combining the real space (digital map) with the virtual world in the integrated platform using time-axis information based on location information, direction information, and a personal profile; wherein mapping the digital map to the virtual world converts planar data consisting of points into solid data having a center point, x-, y-, and z-axis sizes, and an angle, treating the line drawn between two points as an object, finding the midpoint of the two points to serve as the object's center point, obtaining the length and angle between the two points to serve as the object's x-axis size and z-axis angle, and expressing the angle using a quaternion.

3. (deleted)

4. (deleted)

5. The method for providing information based on real life in a virtual space according to claim 2, further comprising embodying a character of the individual, recognizing direction information, and then, in conjunction with the character, implementing movement by reflecting the individual's actual movement in the character.

6. (deleted)

7. (deleted)
KR1020090057142A 2009-06-25 2009-06-25 Method for information service based real life in a cyber-space KR101112190B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020090057142A KR101112190B1 (en) 2009-06-25 2009-06-25 Method for information service based real life in a cyber-space
PCT/KR2010/003649 WO2010150991A2 (en) 2009-06-25 2010-06-08 Method for providing reality-based information in a virtual space

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020090057142A KR101112190B1 (en) 2009-06-25 2009-06-25 Method for information service based real life in a cyber-space

Publications (2)

Publication Number Publication Date
KR20100138556A KR20100138556A (en) 2010-12-31
KR101112190B1 true KR101112190B1 (en) 2012-02-24

Family

ID: 43387002

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090057142A KR101112190B1 (en) 2009-06-25 2009-06-25 Method for information service based real life in a cyber-space

Country Status (2)

Country Link
KR (1) KR101112190B1 (en)
WO (1) WO2010150991A2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101800661B1 (en) 2011-08-24 2017-11-24 삼성전자 주식회사 Method and apparatus for accessing location based service

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07325934A * 1992-07-10 1995-12-12 The Walt Disney Company Method and apparatus for providing enhanced graphics in a virtual world
JP3410088B2 * 1999-06-02 2003-05-26 Fujitsu Limited Virtual communication space construction system corresponding to sensed information in the real world
JP4163624B2 * 2002-01-25 2008-10-08 Iwane Laboratories, Ltd. Automatic working system


Also Published As

Publication number Publication date
KR20100138556A (en) 2010-12-31
WO2010150991A3 (en) 2011-03-31
WO2010150991A2 (en) 2010-12-29


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
J201 Request for trial against refusal decision
AMND Amendment
B701 Decision to grant
GRNT Written decision to grant
FPAY Annual fee payment (payment date: 2015-01-15; year of fee payment: 4)
FPAY Annual fee payment (payment date: 2017-01-03; year of fee payment: 6)
FPAY Annual fee payment (payment date: 2018-01-29; year of fee payment: 7)
FPAY Annual fee payment (payment date: 2019-07-29; year of fee payment: 8)