KR101781471B1 - Simulation system for artillery training - Google Patents

Simulation system for artillery training

Info

Publication number
KR101781471B1
Authority
KR
South Korea
Prior art keywords
image
simulation
binoculars
target image
unit
Prior art date
Application number
KR1020150146409A
Other languages
Korean (ko)
Other versions
KR20170046288A (en)
Inventor
이종필
Original Assignee
주식회사 필텍
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 필텍
Priority to KR1020150146409A
Publication of KR20170046288A
Application granted
Publication of KR101781471B1

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G09B9/003 Simulators for teaching or training purposes for military purposes and tactics
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41A FUNCTIONAL FEATURES OR DETAILS COMMON TO BOTH SMALL ARMS AND ORDNANCE, e.g. CANNONS; MOUNTINGS FOR SMALL ARMS OR ORDNANCE
    • F41A33/00 Adaptations for training; Gun simulators
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41G WEAPON SIGHTS; AIMING
    • F41G3/00 Aiming or laying means
    • F41G3/26 Teaching or practice apparatus for gun-aiming or gun-laying

Abstract

The present invention relates to a simulation system for artillery training, comprising a screen onto which a terrain object image is projected; an image projection device for projecting the terrain object image onto the screen; a simulation server for storing at least one terrain object image and generating a target image, which is a predetermined region of the terrain object image; and simulation binoculars that recognize markers included in the screen or the terrain object image to calculate their current position and attitude information and transmit that information to the simulation server. The simulation server receives the current position and attitude information transmitted from the simulation binoculars, generates a target image corresponding to the position and attitude of the binoculars, and transmits the target image to the simulation binoculars.

Description

SIMULATION SYSTEM FOR ARTILLERY TRAINING

Field of the Invention

The present invention relates to a simulation system for artillery training, and more particularly to a simulation system for artillery training capable of transmitting a virtual image of a specific area to simulation binoculars.

During an engagement, mortars, artillery, and self-propelled guns fire on targets by coordinating among observation posts, fire command posts, and launchers.

Because live-fire artillery training incurs considerable cost each time it is conducted, there is a need to develop an artillery training system that uses simulation techniques to solve this problem.

In an artillery training system, the observation post should calculate the position of the target in real time and provide it to the fire command post and launcher.

In conventional artillery training systems, trainees calculate target coordinates with the naked eye from a simulation image. In particular, observers at an observation post must calculate the position of a target quickly, as in a real engagement, but because the simulation image is monotonous, errors carry over into actual training.

In addition, in large-scale artillery training, the training effect is limited because of the large discrepancy between live-fire training and simulation training.

An object of the present invention is to provide a simulation system for artillery training in which the target image projected on a screen is rendered from a different viewpoint depending on the user's position, and the corresponding image is transmitted independently to each pair of binoculars so that the user can calculate the position of the target.

According to an aspect of the present invention, there is provided a simulation system for artillery training, comprising: a screen onto which a terrain object image is projected; an image projection device for projecting the terrain object image onto the screen; a simulation server for storing at least one terrain object image, transmitting a selected terrain object image to the image projection device, and generating a target image, which is a predetermined region of the terrain object image; and simulation binoculars for recognizing markers included in the screen or the terrain object image to calculate current position and attitude information, transmitting the calculated current position and attitude information to the simulation server, and enlarging and displaying the target image. The simulation server receives the current position and attitude information transmitted from the simulation binoculars, generates the target image corresponding to the terrain object image as seen from the position of the simulation binoculars, and transmits the target image to the simulation binoculars.

The simulation binoculars may include: a body case having eyepiece sections adjacent to the left and right eyes of a user, respectively; a display unit installed inside the body case to display the target image; a lens unit installed to enlarge the target image displayed on the display unit so that the user can identify it; a sight on the outside of the end of the body case for alignment with a marker; a button formed on the outside of the body case to be pressed by the user when the sight and a marker are aligned; and a sensor unit for sensing the button operation and generating the current position and attitude information.

The sensor unit may include at least one of a gyro sensor, an acceleration sensor, and a geomagnetic sensor. The sensor unit may recognize four markers forming a quadrangle to generate plane coordinates, calculate the angles between the plane coordinates, and transmit them to the simulation server.

The simulation server may set the region of the target image on the plane coordinates, calculate the attitude information from the angles, inversely derive the position of the simulation binoculars to generate a corresponding target image, and provide the generated target image to the simulation binoculars.

The lens unit may comprise: a relay lens assembly having a plurality of lenses arranged adjacent to the display unit to enlarge the image of the display unit; and an eyepiece lens assembly disposed between the eyepiece section and the relay lens assembly and having a plurality of lenses arranged to adjust the image distance through the relay lens assembly.

The display unit may include at least one of an LCD and an OLED.

The display unit may include two display devices for displaying images on the left and right eyes of the user.

At least one of the two display elements may shift the transmitted image with respect to the left or right eye and display it.

At least one of the two display elements may be shifted based on the left or right eye.

The two display elements can receive the same image from the simulation server via a single channel HDMI.

The simulation system for artillery training according to an embodiment of the present invention can provide a target image corresponding to the position of the simulation binoculars within the projected terrain object image. Providing an image that matches the position of each pair of simulation binoculars can enhance the training effect.

In addition, the simulation system for artillery training according to an embodiment of the present invention can provide a realistic environment by providing a three-dimensional image to the user (trainee) through the simulation binoculars.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a system diagram showing a simulation system for artillery training according to an embodiment of the present invention.
FIG. 2 is an exploded view showing the structure of the simulation binoculars shown in FIG. 1.
FIG. 3 is a cross-sectional view of the simulation binoculars of FIG. 2.
FIGS. 4 and 5 are diagrams for explaining the measurement of the current position and attitude information by the sensor unit.

The following description of the present invention with reference to the drawings is not limited to a specific embodiment; various modifications may be applied and various embodiments are possible. The description is to be understood as covering all changes, equivalents, and alternatives falling within the spirit and scope of the present invention.

In the following description, the terms first, second, and the like are used to describe various components; the components are not limited by these terms, which are used only to distinguish one component from another.

Like reference numerals used throughout the specification denote like elements.

As used herein, the singular forms "a", "an", and "the" include plural referents unless the context clearly dictates otherwise. The terms "comprising", "including", or "having" are intended to designate the presence of the stated features, integers, steps, operations, elements, components, or combinations thereof, and should not be construed to preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or combinations thereof.

Hereinafter, embodiments of the present invention will be described in detail with reference to FIGS. 1 to 5 attached herewith.

FIG. 1 is a system diagram showing a simulation system for artillery training according to an embodiment of the present invention.

Referring to FIG. 1, a simulation system for artillery training according to an embodiment of the present invention may include a screen 10, a simulation server 100, an image projection device 50, and simulation binoculars 200.

Specifically, the screen 10 displays the terrain object images transmitted from the simulation server 100. The terrain object image may include imagery of the terrain, such as a position at which the user (trainee) directs fire. The terrain object image may be an actual photographed image or a virtual image.

The screen 10 may be provided with at least four markers 20. In an embodiment of the present invention, five markers may be provided at the top, five at the center, and five at the bottom, so that the screen 10 can be divided into three areas. Each marker 20 may be a grid-like mark of a specific color so that it can be easily recognized by the user.

In the embodiment of the present invention, the markers are provided on the screen 10, but the present invention is not limited thereto, and markers may instead be displayed within the terrain object image.

The image projection device 50 projects the terrain object images received from the simulation server 100 onto the screen 10. An ordinary projector or the like can be used as the image projection device 50.

The simulation server 100 stores a plurality of terrain object images and transmits the selected terrain object image to the image projection device 50 according to the administrator's selection. The simulation server 100 may select a target image from the same terrain object image supplied to the image projection device 50 and transmit the selected target image to the simulation binoculars 200. The selection of the target image is described later.

The simulation server 100 may receive the current position and attitude information of the simulation binoculars 200 and generate a target image. That is, the simulation server 100 may extract the target image to be displayed on the simulation binoculars 200 and transmit it.

For this purpose, the simulation server 100 may include a functional unit such as a storage unit (not shown) and an image extracting unit (not shown).

The simulation binoculars 200 may display different target images depending on their position in front of the screen 10. That is, the same area is observed differently from the viewpoints of users at different positions. To reflect this in the target image of the simulation binoculars 200, the simulation server 100 must transmit a different target image for each viewpoint of the selected region, based on the position and attitude information of the simulation binoculars 200.

For example, when the region A shown in FIG. 4 is observed from position 1 and from position 2, the simulation server 100 may transmit target images with different viewpoints to the respective simulation binoculars 200.
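
As a minimal illustrative sketch (not the patent's disclosed implementation), this server-side behavior can be imagined as cropping the selected region of the stored terrain object image and warping it with a homography that approximates the off-axis viewpoint. The function name, the file name, the shear-based warp, and the use of OpenCV are assumptions for illustration only.

import numpy as np
import cv2

def make_target_image(terrain, region, yaw_deg, out_size=(640, 480)):
    # Crop region (x, y, w, h) from the terrain image and skew it to
    # approximate viewing the screen from an off-axis position.
    x, y, w, h = region
    src = np.float32([[x, y], [x + w, y], [x + w, y + h], [x, y + h]])
    # Crude illustrative approximation: shrink one vertical edge in
    # proportion to the yaw angle of the binoculars.
    shrink = 0.25 * out_size[1] * np.sin(np.radians(yaw_deg))
    dst = np.float32([[0, 0], [out_size[0], shrink],
                      [out_size[0], out_size[1] - shrink], [0, out_size[1]]])
    H = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(terrain, H, out_size)

terrain = cv2.imread("terrain.png")  # a stored terrain object image (hypothetical file)
view_1 = make_target_image(terrain, (100, 50, 320, 240), yaw_deg=0)   # position 1
view_2 = make_target_image(terrain, (100, 50, 320, 240), yaw_deg=25)  # position 2

Each pair of binoculars thus receives its own rendering of the same region A, which is the point of the paragraph above.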

The simulation binoculars 200 display a target image, enlarge it, and allow the user (a trainee) to recognize it. In addition, the simulation binoculars 200 transmit the current position and attitude information to the simulation server 100.

That is, in the present invention, the terrain object image projected on the screen 10 is an image that can be recognized with the naked eye, and the simulation binoculars 200 can enlarge a certain area of the terrain object image and display it to the user.

In particular, the simulation binoculars 200 and the simulation server 100 communicate in real time to generate and display the target image. At this time, the simulation server 100 and the simulation binoculars 200 use a single-channel HDMI image transmission method.

The simulation binoculars 200 are equipped with sensing means for measuring the current position and attitude, display means capable of displaying the target image, and a lens for enlarging the target image. The simulation binoculars 200 are described in detail with reference to FIGS. 2 to 4.

FIG. 2 is an exploded perspective view of the simulation binoculars shown in FIG. 1, and FIG. 3 is a cross-sectional view of the simulation binoculars shown in FIG. 2.

Referring to FIGS. 2 and 3, the simulation binoculars 200 include a body case 210, a display unit 240, a lens unit 250, a sight 220, a signal processing unit 270, a button 280, and a sensor unit 230.

Specifically, the body case 210 forms the outer shape of the simulation binoculars 200, is equipped with eyepieces 260 that fit closely to the user's eyes, and is formed to have the same shape and weight as actual military binoculars. The body case 210 provides a space in which the display unit 240, the lens unit 250, and the sensor unit 230 are installed.

The lens unit 250 enlarges the target image displayed on the display unit 240 so that the user can recognize it. To this end, a relay lens assembly 255 and an eyepiece lens assembly 257 are disposed in series within the lens unit 250. A pair of lens units 250 is provided, one for each of the left and right eyes.

The relay lens assembly 255 is installed close to the display unit 240. The relay lens assembly 255 may be a combination of a convex lens, a concave lens, a telecentric lens, an achromatic doublet lens, or the like.

The eyepiece lens assembly 257 is formed between the eyepiece 260 and the relay lens assembly 255. The eyepiece lens assembly 257 has a plurality of lenses arranged to adjust the distance of the image passing through the relay lens assembly 255.

The eyepiece lens assembly 257 may be formed by combining convex lenses, plano-convex lenses, concave lenses, and the like. For example, the eyepiece lens assembly 257 may be a Ramsden eyepiece consisting of two plano-convex lenses, or a Kellner eyepiece, a compensating eyepiece, or the like.

The signal processing unit 270 provides the image received from the simulation server 100 to the display unit 240 and provides the current position and attitude information measured by the sensor unit 230 to the simulation server 100. The signal processing unit 270 includes a communication module and can transmit signals to the simulation server 100 via TCP/IP.
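
A minimal sketch of such a report, assuming (purely for illustration) a JSON-over-TCP message layout; the host, port, field names, and framing are not specified by the patent:

import json
import socket

def send_pose(host, port, x, y, alpha, beta):
    # Send one current-position/attitude sample as a newline-terminated JSON record.
    msg = json.dumps({"x": x, "y": y, "alpha": alpha, "beta": beta}) + "\n"
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(msg.encode("utf-8"))

# Example: report the pose measured when the trainee pressed the button.
send_pose("192.168.0.10", 9000, x=1.25, y=0.40, alpha=12.0, beta=3.5)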

The display unit 240 is provided on the side opposite the eyepiece 260. The display unit 240 is mounted inside the body case 210 so that it is protected from external impact. The display unit 240 may be provided with two display elements 242 and 244 corresponding to the left and right eyes. The display unit 240 may be a display device such as an LCD or an OLED and displays the target image provided by the simulation server 100. Since the display unit 240 is disposed at the end of the lens unit 250, it is preferably small enough that the user views its image through the lens unit 250.

At least one of the two display elements 242 and 244 of the display unit 240 can be shifted from the center. For example, the first display element 242 provided on the left-eye side may be disposed at the center of the left-eye lens unit 250, while the second display element 244 provided on the right-eye side may be shifted to the right or left of the center of the right-eye lens unit 250. Accordingly, the target image transmitted to the display unit 240 can be perceived in three dimensions.

Conversely, the second display element 244 may be positioned at the center of the right-eye lens unit 250 while the first display element 242 is shifted to the left or right of the center of the left-eye lens unit 250.

According to an embodiment of the present invention, in order for the target image to be perceived as three-dimensional, a shifted target image may be provided to either of the first and second display elements 242 and 244. For example, the first display element 242 may be provided with the target image while the second display element 244 is provided with a shifted target image, so that the target image is perceived in three dimensions. Conversely, only the first display element 242 may be provided with a shifted target image to the same effect.
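
A minimal sketch of the shift described above, assuming the shift is a simple horizontal translation of the same target image (the disparity value and array layout are illustrative):

import numpy as np

def shift_horizontal(image, disparity_px):
    # Shift an HxWxC image horizontally by disparity_px, padding with black.
    shifted = np.zeros_like(image)
    if disparity_px > 0:
        shifted[:, disparity_px:] = image[:, :-disparity_px]
    elif disparity_px < 0:
        shifted[:, :disparity_px] = image[:, -disparity_px:]
    else:
        shifted = image.copy()
    return shifted

target = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in target image
left_eye = target                                  # first display element 242
right_eye = shift_horizontal(target, 8)            # second display element 244, shifted

Because both elements receive the same source image, only the presentation differs between the eyes, which is what produces the three-dimensional impression.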

Here, the first and second display devices 242 and 244 can receive the same target image from the simulation server 100 through a single-channel HDMI.

The sensor unit 230 may provide the location of the simulation binoculars 200 to the simulation server 100. The sensor unit 230 recognizes at least four markers on the screen 10 and transmits the current position and attitude information to the simulation server 100.

To this end, the sensor unit 230 may include at least one of a gyro sensor, an acceleration sensor, and a geomagnetic sensor.

Hereinafter, the operation of the sensor unit 230 will be described with reference to FIGS. 4 and 5.

FIGS. 4 and 5 are views for explaining the measurement of the current position and attitude information by the sensor unit.

Referring to FIGS. 4 and 5, the sensor unit 230 recognizes the four markers provided on the screen 10 to measure the current position and attitude information of the simulation binoculars 200. The four markers form a rectangle.

First, when the first marker 20a is sighted and the button 280 provided on the simulation binoculars 200 is pressed, it is recognized as the first coordinate (0, 0). Then, when the second marker 20b is sighted and the button 280 is pressed, the movement distance between the first marker 20a and the second marker 20b is calculated and set as the second coordinate (0, x). The third marker 20c is likewise recognized as the third coordinate (y, x) by calculating the travel distance from the second marker 20b, and the fourth marker 20d is recognized as the fourth coordinate (y, 0) by calculating the travel distance from the third marker 20c.

Coordinate recognition can use the angles and accelerations measured by the gyro sensor or the acceleration sensor. That is, the distance between the first marker 20a and the second marker 20b is calculated by measuring the acceleration while moving between them, and the two markers are recognized as plane coordinates separated by that distance. For example, the coordinates of the first marker 20a are set to (0, 0) and the coordinates of the second marker 20b to (0, x).

Then, the distance between the second marker 20b and the third marker 20c is calculated by measuring the acceleration while moving between them, and the coordinates of the third marker 20c are set to (y, x). Next, the acceleration between the third marker 20c and the fourth marker 20d is measured to calculate the distance between them, and the coordinates of the fourth marker 20d are set to (y, 0).
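
A minimal sketch of this distance estimation, assuming (as the text suggests but does not detail) that the travel distance is obtained by double integration of accelerometer samples captured between two button presses, with the binoculars at rest at both presses:

import numpy as np

def travel_distance(accel, dt):
    # Integrate acceleration (m/s^2) sampled at interval dt twice to get displacement.
    velocity = np.cumsum(accel) * dt      # first integration: velocity
    position = np.cumsum(velocity) * dt   # second integration: position
    return float(position[-1])

# Example: a synthetic accelerate-then-decelerate sweep from marker 20a to 20b.
dt = 0.01
accel = np.concatenate([np.full(50, 0.8), np.full(50, -0.8)])  # m/s^2
x = travel_distance(accel, dt)  # the distance assigned to coordinate (0, x)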

In addition, a first angle α swept while moving from the first marker 20a to the second marker 20b and a second angle β swept while moving from the second marker 20b to the third marker 20c are measured, from which the attitude information can be set.

The four sets of coordinates and the first and second angles are transmitted to the simulation server 100.

The simulation server 100 sets the region of the target image from the received four sets of coordinates. The simulation server 100 also sets the attitude information from the first and second angles α and β.
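
A minimal sketch of the server-side use of this data; the range formula below is a simplified stand-in (an observer near the perpendicular bisector of markers 20a and 20b, which subtend the first angle α) rather than the patent's actual inverse calculation:

import math

def region_and_range(coords, alpha_deg):
    # coords: the four (x, y) marker coordinates reported by the binoculars.
    xs = [c[0] for c in coords]
    ys = [c[1] for c in coords]
    region = (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))
    d = math.dist(coords[0], coords[1])   # separation of markers 20a and 20b
    r = (d / 2) / math.tan(math.radians(alpha_deg) / 2)  # approximate range
    return region, r

region, r = region_and_range([(0, 0), (0, 4.0), (3.0, 4.0), (3.0, 0)], alpha_deg=30.0)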

Referring to FIG. 5, the simulation server 100 receives the coordinate information of the region and the first and second angles α1 and β1 from the simulation binoculars 200a at position 1, and from these inversely derives the position of the simulation binoculars 200a.

Likewise, the simulation server 100 receives the coordinate information of the same region and the first and second angles α2 and β2 from the simulation binoculars 200b at position 2, and inversely derives the position of the simulation binoculars 200b at position 2.

In this case, the target images for position 1 and position 2 preferably cover the same region but are generated from different viewpoints.

That is, the simulation server 100 provides the simulation binoculars 200a at position 1 with a target image showing the front of an obstacle or facility as observed from that direction, and provides the simulation binoculars 200b at position 2 with a target image showing the side of the same obstacle or facility. Accordingly, the simulation server 100 can generate the target image differently according to the position of the simulation binoculars 200 and transmit it so that the training is perceived as a real situation.

That is, since the fire coordinates calculated for the same object differ according to the position of the simulation binoculars, providing such an environment improves the accuracy of artillery training.

The sensor unit 230 recognizes the four markers provided on the screen 10 to measure the current position and attitude information of the simulation binoculars 200; for example, the four markers located at the corners of the screen 10 can be recognized.

In this case, the current position and attitude information of the binoculars 200 can be measured by the mechanism described above, and the edges of the screen 10 can be recognized by the simulation server 100.

The simulation server 100 can issue an alarm to the binoculars 200 when it determines from the attitude information that the binoculars 200 are turning away from the screen 10, so that the binoculars 200 can be readjusted to face the screen 10.
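
A minimal sketch of such a check, with the screen extent, field of view, and yaw values chosen purely for illustration:

def facing_screen(yaw_deg, half_fov_deg, screen_half_angle_deg):
    # True if the binoculars' view cone still overlaps the screen extent.
    return abs(yaw_deg) - half_fov_deg <= screen_half_angle_deg

def check_and_alarm(yaw_deg):
    # Screen spans +/-40 degrees from the observation point; the binoculars'
    # field of view is 10 degrees. Both values are illustrative.
    if facing_screen(yaw_deg, half_fov_deg=5.0, screen_half_angle_deg=40.0):
        return "ok"
    return "alarm: re-aim toward the screen"

print(check_and_alarm(12.0))   # ok
print(check_and_alarm(50.0))   # alarm: re-aim toward the screen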

While the present invention has been described in connection with what are presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements. The embodiments described above are therefore to be considered in all respects as illustrative and not restrictive.

10: Screen
20: Marker
50: Image projection device
100: Simulation server
200: Simulation binoculars
210: Body case
220: Sight
230: Sensor unit
240: Display unit
242: First display element
244: Second display element
250: Lens unit
255: Relay lens assembly
257: Eyepiece lens assembly
260: Eyepiece
270: Signal processing unit
280: Button

Claims (10)

1. A simulation system for artillery training, comprising:
a screen onto which a terrain object image is projected;
an image projection device for projecting the terrain object image onto the screen;
a simulation server for storing at least one terrain object image, transmitting a selected terrain object image to the image projection device, and generating a target image, which is a predetermined region of the terrain object image; and
simulation binoculars for recognizing markers included in the screen or the terrain object image, calculating current position and attitude information from the recognized markers, transmitting the calculated current position and attitude information to the simulation server, and enlarging and displaying the target image,
wherein the simulation server receives the current position and attitude information transmitted from the simulation binoculars, generates the target image corresponding to the terrain object image as seen from the position of the simulation binoculars, and transmits the generated target image to the simulation binoculars, and
wherein the simulation binoculars comprise:
a body case having eyepiece sections adjacent to the left and right eyes of a user, respectively;
a display unit installed inside the body case to display the target image;
a lens unit installed to enlarge the target image displayed on the display unit so that the user can identify it;
a sight on the outside of the end of the body case for alignment with a marker;
a button formed on the outside of the body case to be pressed by the user when the sight and a marker are aligned; and
a sensor unit for sensing operation of the button and generating the current position and attitude information.
2. (Deleted)
3. The system according to claim 1, wherein the sensor unit includes at least one of a gyro sensor, an acceleration sensor, and a geomagnetic sensor, and wherein the sensor unit recognizes four markers forming a quadrangle to generate plane coordinates, calculates the angles between the plane coordinates, and transmits them to the simulation server.
4. The system according to claim 3, wherein the simulation server sets the region of the target image on the plane coordinates, calculates the attitude information from the angles, inversely derives the position of the simulation binoculars to generate a corresponding target image, and provides the generated target image to the simulation binoculars.
5. The system according to claim 1, wherein the lens unit comprises: a relay lens assembly having a plurality of lenses arranged adjacent to the display unit to enlarge the image of the display unit; and an eyepiece lens assembly installed between the eyepiece section and the relay lens assembly and having a plurality of lenses arranged to adjust the distance of the image through the relay lens assembly.
6. The system according to claim 1, wherein the display unit is an LCD or an OLED.
7. The system according to claim 1, wherein the display unit is provided with two display elements for displaying an image to each of the left and right eyes of the user.
8. The system according to claim 7, wherein at least one of the two display elements shifts the transmitted image with respect to the left or right eye and displays it.
9. The system according to claim 7, wherein at least one of the two display elements is shifted with respect to the left or right eye.
10. The system according to claim 7, wherein the two display elements receive the same image from the simulation server via single-channel HDMI.
KR1020150146409A 2015-10-21 2015-10-21 Simulation system for artillery training KR101781471B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150146409A KR101781471B1 (en) 2015-10-21 2015-10-21 Simulation system for artillery training

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150146409A KR101781471B1 (en) 2015-10-21 2015-10-21 Simulation system for artillery training

Publications (2)

Publication Number Publication Date
KR20170046288A KR20170046288A (en) 2017-05-02
KR101781471B1 (en) 2017-09-25

Family

ID=58742747

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150146409A KR101781471B1 (en) 2015-10-21 2015-10-21 Simulation system for artillery training

Country Status (1)

Country Link
KR (1) KR101781471B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101877215B1 (en) * 2017-11-10 2018-07-12 엘아이지넥스원 주식회사 Method and apparatus for manufacturing simulation image reflecting noise induced image sensor

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114327066A (en) * 2021-12-30 2022-04-12 上海曼恒数字技术股份有限公司 Three-dimensional display method, device and equipment of virtual reality screen and storage medium

Also Published As

Publication number Publication date
KR20170046288A (en) 2017-05-02

Similar Documents

Publication Publication Date Title
US20220366598A1 (en) Calibration system and method to align a 3d virtual scene and a 3d real world for a stereoscopic head-mounted display
US11423586B2 (en) Augmented reality vision system for tracking and geolocating objects of interest
US9892563B2 (en) System and method for generating a mixed reality environment
KR102060453B1 (en) Image display system, control method of image display system, image transmission system and head mounted display
US10071306B2 (en) System and method for determining orientation using tracking cameras and inertial measurements
US20160292924A1 (en) System and method for augmented reality and virtual reality applications
US9132342B2 (en) Dynamic environment and location based augmented reality (AR) systems
ES2656868T3 (en) Portable device, virtual reality system and method
US10030931B1 (en) Head mounted display-based training tool
CN108479062B (en) Game event trigger control method, device and computer readable storage medium
CN112525185B (en) AR navigation method based on positioning and AR head-mounted display device
US11269400B2 (en) Positioning system
CA2765668A1 (en) Method and arrangement of a flight simulator system
KR101781471B1 (en) Simulation system for artillery training
KR20160090042A (en) Arcade game system by 3D HMD
KR101696243B1 (en) Virtual reality system and method for realizing virtual reality therof
US11902499B2 (en) Simulation sighting binoculars, and simulation system and methods
EP3132279B1 (en) A target determining method and system
JP7018443B2 (en) A method to assist in target localization and a viable observation device for this method
WO2011075061A1 (en) Device for measuring distance to real and virtual objects
Brookshire et al. Military vehicle training with augmented reality
KR20180118473A (en) Complex laser distance measuring apparatus using gps cordinate information using gps cordinate information and golf course image
KR20220067061A (en) Support device recognizing absolute position of hololens
JP6896670B2 (en) Radiation dose distribution display device and radiation dose distribution display method
SE0950962A1 (en) Distance measurement device for real and virtual objects

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
GRNT Written decision to grant