KR101657646B1 - Method for displaying label in augmented reality service - Google Patents

Method for displaying label in augmented reality service

Info

Publication number
KR101657646B1
KR101657646B1 (application KR1020160017638A)
Authority
KR
South Korea
Prior art keywords
label
augmented reality
determining
information
ship
Prior art date
Application number
KR1020160017638A
Other languages
Korean (ko)
Inventor
오재용
박세길
김선영
Original Assignee
한국해양과학기술원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국해양과학기술원 filed Critical 한국해양과학기술원
Priority to KR1020160017638A priority Critical patent/KR101657646B1/en
Application granted granted Critical
Publication of KR101657646B1 publication Critical patent/KR101657646B1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63BSHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING 
    • B63B49/00Arrangements of nautical instruments or navigational aids
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63BSHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING 
    • B63B2213/00Navigational aids and use thereof, not otherwise provided for in this class
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Ocean & Marine Engineering (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a method for displaying labels when outputting augmented reality navigation information, capable of improving the legibility of text information output by an augmented reality navigation support system. The label display method includes: a first step of extracting a target position corresponding to each of multiple objects in an image input in real time from a camera installed on a vessel; a second step of determining a label for each extracted target based on navigation information input from the outside; a third step of determining the initial position and size of the label based on the target position; a fourth step of determining the final position of the label by checking for overlap between labels and the distance between each label and its target; and a fifth step of displaying the label at its final position on an augmented reality screen that outputs the image and the vessel's navigation information in augmented reality.

Description

Field of the Invention

The present invention relates to a label display method for an augmented reality screen and, more particularly, to a label display method for outputting augmented reality navigation information that can improve the readability of text information output from an augmented reality navigation support system.

Recently, the risk of marine accidents has been increasing due to the growth of maritime traffic and the popularity of marine leisure activities. Most of these accidents result from operational errors, and in particular from collisions caused by violations of maritime and navigation regulations.

Accordingly, as the damage caused by marine accidents grows, interest in the safe navigation of vessels is increasing, and technologies for preventing marine accidents are being actively developed.

To improve the safety of ships, systems applying advanced IT technologies such as GNSS (Global Navigation Satellite System), AIS (Automatic Identification System), and ECDIS (Electronic Chart Display and Information System) have been installed on board. On the other hand, such complex systems may interfere with the navigator's rapid decision making.

As a result, there is a need to process and integrate navigation information so that the navigator can make navigation decisions quickly and accurately. To this end, augmented reality (AR) technology, which can provide intuitive information to the navigator, has been introduced.

Augmented reality is a technique derived from the field of virtual reality; it superimposes a three-dimensional virtual image on a real image or background and displays them as a single image.

Whereas virtual reality builds an entirely virtual environment through a computer and lets users interact with it, augmented reality provides an enhanced view of reality by overlaying virtual objects on the real world.

Regarding augmented reality technology, Korean Patent Registration No. 10-1493614 discloses a ship-handling simulator using an augmented-reality-based virtual bridge system for the safe operation of a ship, and a technique for implementing a ship-handling simulation using it.

However, in a conventional augmented reality system, various pieces of information must be output on a limited screen, so the information displayed on the screen may overlap, as shown in Fig. 1.

An augmented reality system for navigation support can provide information about ships around the own ship, navigation mark information, and electronic chart information. When entering or leaving a port, many ships surround the own ship, so the amount of information increases accordingly.

In particular, as the amount of information to be output increases, text or leader lines may overlap, deteriorating readability.

In addition, when information on other ships is displayed, it tends to be concentrated near the horizon or on the water surface, making it difficult to display clearly; moreover, since the position of the output information is ambiguous, the target object may be misidentified.

Registration No. 10-1493614 (registered on February 09, 2015)

SUMMARY OF THE INVENTION The present invention has been conceived to solve the problems described above, and it is an object of the present invention to provide a label display method for outputting augmented reality navigation information that prevents overlap of augmented reality information and improves readability.

It is another object of the present invention to provide a label display method for outputting augmented reality navigation information that can provide navigation information effectively by outputting label information differently according to the importance of the information.

However, the objects of the present invention are not limited to those mentioned above; other objects not mentioned will be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, a label display method for outputting augmented reality navigation information comprises: a first step of extracting a target position corresponding to each of a plurality of objects included in an image input in real time from a camera disposed on a ship; a second step of determining a label representing each extracted target based on navigation information input from the outside; a third step of determining the initial position and size of the label based on the target position; a fourth step of determining the final position of the label by checking the distance between labels and whether labels overlap; and a fifth step of displaying the label at its determined final position on an augmented reality screen that outputs the ship's navigation information and the image in augmented reality.

In addition, in the label display method for outputting augmented reality navigation information of the present invention, the target positions include the positions of other ships, navigation marks, and landmarks around the own ship.

In the label display method for outputting augmented reality navigation information of the present invention, the label includes text information, and the size of the label is determined according to the importance set in the navigation information.

In the third step of the label display method for outputting augmented reality navigation information according to the present invention, the initial position of the label is set to one of left/top, right/top, left/bottom, and right/bottom based on the screen position of the target.

In the fourth step of the label display method for outputting augmented reality navigation information according to the present invention, the method comprises: determining a search direction for position movement from the initial position of the label; setting a position movement to the point closest to the target position while keeping the distance to other labels at or above a set range without overlap; and determining the final position of the label by summing the position-change differences of all moved labels and judging that the process has converged when the summed value falls within a set error range.

Here, the distance between labels denotes the length of the portion of the line connecting the midpoints of two labels that lies between them.

In the fifth step of the label display method for outputting augmented reality navigation information according to the present invention, the label displayed on the augmented reality screen is displayed with at least one attribute, such as thickness, blinking, or a different color, selected according to the importance of the label, and the importance of the label includes the risk of collision between the sailing ship and the object.

According to the label display method for outputting augmented reality navigation information of the present invention, overlap of augmented reality information is prevented, improving readability and allowing the additional information displayed on the screen to be output effectively.

In addition, according to the present invention, navigation information can be provided to the navigator effectively by outputting label information differently according to its importance.

FIG. 1 is an exemplary diagram illustrating labels displayed on an augmented reality screen according to the prior art.
FIG. 2 is a flowchart illustrating a label display method for outputting augmented reality navigation information according to the present invention.
FIG. 3 is a block diagram schematically showing the configuration of a label display system for an augmented reality screen that outputs augmented reality navigation information according to the present invention.
FIG. 4 is an exemplary diagram showing initial label positions relative to the target's position on the augmented reality screen.
FIG. 5 is an exemplary diagram illustrating how the distance between labels is determined.
FIG. 6 is an exemplary diagram illustrating labels displayed on an augmented reality screen according to an embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, detailed descriptions of known functions and configurations incorporated herein will be omitted when they may obscure the subject matter of the present invention.

The embodiments according to the concept of the present invention can be variously modified and can take various forms, and specific embodiments are illustrated in the drawings and described in detail in the specification or application. It is to be understood, however, that this is not intended to limit the embodiments according to the concept of the present invention to the particular forms disclosed, but to include all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements. Other expressions describing the relationship between components, such as "between" and "directly between" or "adjacent to" and "directly adjacent to", should be interpreted likewise.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this specification, terms such as "comprises" or "having" specify the presence of stated features, numbers, steps, operations, components, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

FIG. 2 is a flowchart showing a label display method for outputting augmented reality navigation information according to the present invention, and FIG. 3 is a block diagram schematically showing the configuration of a label display system implementing the label display method.

Referring to FIG. 3, a label display system 10 for an augmented reality screen capable of implementing the label display method for outputting augmented reality navigation information according to the present invention includes a target position confirmation unit 100, an initial position determination unit 110, a search direction determination unit 120, a label position confirmation unit 130, a final position determination unit 140, and a label output unit 150.

The label display system 10 implementing the label display method of the present invention may be configured as described above, or may be implemented as a program executable by a computer.

The label display method for outputting navigation information will now be described with reference to the configuration of the label display system 10 described above.

First, the target position confirmation unit 100 extracts the position of the target corresponding to each object, for example each surrounding ship, from an image that is input in real time from one or more cameras (not shown) arranged on the ship and that contains a plurality of objects (S101).

At this time, the target position confirmation unit 100 can extract the position of each object in the image using various existing image processing methods. The target positions may include the positions of other ships, navigation marks, landmarks, and the like around the own ship.

Then, the initial position determination unit 110 determines a label, that is, a tag, for each extracted target position based on navigation information input from the outside (for example, a maritime traffic control center or an automatic ship identification device) (S102). The label may include text information so that the navigator can recognize information about each object.

In addition, the initial position determination unit 110 determines the initial position and size of the label based on the target position (S103). The size of the label can be determined according to the importance set in the navigation information.

As shown in FIG. 4, the initial position of the label can be arbitrarily set to the left/top L1, right/top L2, left/bottom L3, or right/bottom L4 relative to the target position.
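As a rough illustration of this corner-anchored initial placement, the sketch below computes a label box's top-left corner for each of the four anchors; the function name, the pixel offset, and the exact geometry of each anchor are assumptions for illustration, not taken from the patent:

```python
# Hypothetical sketch of initial label placement relative to a target's
# screen position (FIG. 4): one of four corner anchors L1..L4.
def initial_label_position(target_xy, label_w, label_h, corner="L1", offset=20):
    """Return the top-left corner of the label box for a given anchor.

    corner: "L1" left/top, "L2" right/top, "L3" left/bottom, "L4" right/bottom.
    The offset (in pixels) and coordinate conventions are illustrative.
    """
    x, y = target_xy
    if corner == "L1":      # label above-left of the target
        return (x - offset - label_w, y - offset - label_h)
    if corner == "L2":      # above-right
        return (x + offset, y - offset - label_h)
    if corner == "L3":      # below-left
        return (x - offset - label_w, y + offset)
    if corner == "L4":      # below-right
        return (x + offset, y + offset)
    raise ValueError("corner must be one of L1..L4")
```

In a screen coordinate system with y growing downward, "top" anchors subtract the label height so the box sits above the target.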

The search direction determination unit 120 determines the search directions for position movement from the initial position of the label determined by the initial position determination unit 110, for example eight directions set at 45-degree intervals around the initial position (S104).

In setting the search directions for the position movement, the search direction determination unit 120 can take into account the presence of other labels, whether the label intersects another label, and the current position of the label on the screen.
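The eight candidate directions at 45-degree intervals mentioned above can be sketched as unit vectors; this minimal helper is illustrative only and any filtering by overlap or screen bounds is left out:

```python
import math

# Illustrative sketch: n candidate search directions at equal angular
# intervals (n=8 gives the 45-degree spacing described in the patent),
# returned as unit vectors (dx, dy).
def search_directions(n=8):
    step = 2 * math.pi / n
    return [(math.cos(i * step), math.sin(i * step)) for i in range(n)]
```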

The label position confirmation unit 130 moves the label with respect to the search directions set in step S104 (S105). At this time, the label position confirmation unit 130 can move the label to the position closest to the target position while keeping the distance to other labels at or above a set range, without overlapping other labels.

Here, the distance between labels denotes the length d of the portion of the line connecting the midpoints of the two labels (Label 1 and Label 2) that lies between them, as shown in FIG. 5.
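One possible way to compute this distance d for axis-aligned label boxes is sketched below: clip the midpoint-connecting segment against each box and measure what remains between them. This geometric reading of FIG. 5 and all names are assumptions for illustration:

```python
import math

def label_gap(c1, size1, c2, size2):
    """Length d of the midpoint-connecting segment lying between two
    axis-aligned label boxes (centers c1/c2, sizes (width, height)).
    Returns 0.0 when the boxes overlap along that line."""
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0

    def t_exit(size):
        # Parameter (fraction of the full segment) at which the segment,
        # starting from a box's center, crosses that box's boundary.
        w, h = size
        tx = (w / 2) / abs(dx) if dx else float("inf")
        ty = (h / 2) / abs(dy) if dy else float("inf")
        return min(tx, ty)

    gap = dist * (1.0 - t_exit(size1) - t_exit(size2))
    return max(0.0, gap)
```

For example, two 4x2 boxes centered at (0, 0) and (10, 0) leave a gap of 6 along the connecting line.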

In step S107, the final position determination unit 140 sums the position-change differences of all moved labels and determines whether the summed value has converged. If the sum is within the predetermined error range, the final position of each label is determined (S108).

In the present invention, the final non-overlapping label positions can be determined by iteratively checking for overlap starting from the initial position of each label.

Once positions at which no labels overlap are found through the iterative loop, no further position changes are needed. To judge the end of the loop, the position-change differences are calculated and the convergence of their summed value is used.

For example, when the search is complete the position-change difference can be zero, but a state converging to any small value can also be set as the completion condition.
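The termination test described here can be sketched as follows; `step_fn` is a placeholder for the overlap- and distance-driven position update of steps S104–S105, and the tolerance and iteration cap are invented values:

```python
# Hedged sketch of the convergence loop (S106-S108): repeatedly apply a
# position update to every label and stop once the summed per-label
# displacement falls within a small tolerance (not necessarily zero).
def relax_labels(labels, step_fn, tol=0.5, max_iter=100):
    """labels: list of (x, y) positions; step_fn maps one position to its
    proposed new position. Returns the converged positions."""
    for _ in range(max_iter):
        moved = [step_fn(p) for p in labels]
        total = sum(abs(nx - x) + abs(ny - y)
                    for (x, y), (nx, ny) in zip(labels, moved))
        labels = moved
        if total <= tol:        # converged: positions essentially stable
            break
    return labels
```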

As shown in FIG. 6, the label output unit 150 outputs each label whose final position has been determined to the augmented reality screen, which outputs the image and the navigation information in augmented reality (S109). In this case, the label output unit 150 can select at least one attribute of the label, that is, a different thickness, blinking level, or color, according to the importance of the label, and output it to the augmented reality screen.

Here, the importance of the label may include information such as the risk of collision between the sailing ship and the object.
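A minimal sketch of mapping such an importance score (e.g. a normalized collision risk) to display attributes follows; the thresholds, score range, and attribute values are all invented for illustration and are not prescribed by the patent:

```python
# Illustrative importance-to-style mapping: higher importance gets a
# thicker, blinking, more salient label (thresholds are placeholders).
def label_style(importance):
    """importance: assumed score in [0, 1], e.g. collision risk."""
    if importance >= 0.8:
        return {"thickness": 3, "blink": True,  "color": "red"}
    if importance >= 0.5:
        return {"thickness": 2, "blink": False, "color": "orange"}
    return {"thickness": 1, "blink": False, "color": "white"}
```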

As described above, the label display method according to the present invention minimizes overlap of augmented reality information and can provide additional information to the navigator more effectively.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; on the contrary, it is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. Accordingly, the true scope of the present invention should be determined by the technical idea of the appended claims.

10: label display system
100: target position confirmation unit 110: initial position determination unit
120: search direction determination unit 130: label position confirmation unit
140: final position determination unit 150: label output unit

Claims (8)

A first step of extracting a target position corresponding to each of a plurality of objects included in an image input in real time from a camera disposed on the ship;
A second step of determining, based on navigation information input from the outside, a label representing the target;
A third step of determining the initial position and size of the label based on the target position;
A fourth step of determining a final position of the label by checking the distance between labels and whether the labels overlap; and
A fifth step of displaying the ship's navigation information and the label at its determined final position on an augmented reality screen that outputs the image in augmented reality,
wherein the fourth step comprises:
determining a search direction for position movement from the initial position of the label;
setting a position movement to the point closest to the target position while keeping the distance to other labels at or above a set range; and
determining the final position of the label by summing the position-change differences of all moved labels and judging that convergence has been reached when the summed value falls within a set error range.
(Here, the distance between labels denotes the length of the portion of the line connecting the midpoints of the two labels that lies between them.)
The method according to claim 1,
Wherein the target positions include the positions of other ships, navigation marks, and landmarks around the own ship.
The method according to claim 1,
In the second step,
Wherein the label includes text information.
The method according to claim 1,
In the third step,
Wherein the size of the label is determined according to the importance set in the navigation information.
The method according to claim 1,
In the third step,
Wherein the initial position of the label is set to one of left/top, left/bottom, right/top, and right/bottom based on the screen position of the target.
delete
The method according to claim 1,
In the fifth step,
Wherein the label displayed on the augmented reality screen is displayed with at least one attribute of thickness, blinking, and a different color selected according to the importance of the label.
8. The method of claim 7,
Wherein the importance of the label includes the risk of collision between the sailing ship and the object.
KR1020160017638A 2016-02-16 2016-02-16 Method for displaying label in augmented reality service KR101657646B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160017638A KR101657646B1 (en) 2016-02-16 2016-02-16 Method for displaying label in augmented reality service

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160017638A KR101657646B1 (en) 2016-02-16 2016-02-16 Method for displaying label in augmented reality service

Publications (1)

Publication Number Publication Date
KR101657646B1 true KR101657646B1 (en) 2016-09-22

Family

ID=57102758

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160017638A KR101657646B1 (en) 2016-02-16 2016-02-16 Method for displaying label in augmented reality service

Country Status (1)

Country Link
KR (1) KR101657646B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190059083A (en) * 2017-11-22 2019-05-30 한국전자통신연구원 Apparatus and method for recognition marine situation based image division
KR20230040487A (en) * 2021-09-16 2023-03-23 삼성중공업 주식회사 Vessel having object recognition apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101072397B1 (en) * 2011-07-21 2011-10-11 한국해양연구원 A augmented reality system for vessel using monitor and the method using thereof
KR20120032336A (en) * 2010-09-28 2012-04-05 엘지전자 주식회사 Method for displaying information of augmented reality and terminal thereof
KR101493614B1 (en) 2013-11-01 2015-02-13 (주)세이프텍리서치 Ship Navigation Simulator and Design Method by using Augmented Reality Technology and Virtual Bridge System

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120032336A (en) * 2010-09-28 2012-04-05 엘지전자 주식회사 Method for displaying information of augmented reality and terminal thereof
KR101072397B1 (en) * 2011-07-21 2011-10-11 한국해양연구원 A augmented reality system for vessel using monitor and the method using thereof
KR101493614B1 (en) 2013-11-01 2015-02-13 (주)세이프텍리서치 Ship Navigation Simulator and Design Method by using Augmented Reality Technology and Virtual Bridge System

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190059083A (en) * 2017-11-22 2019-05-30 한국전자통신연구원 Apparatus and method for recognition marine situation based image division
KR102069694B1 (en) 2017-11-22 2020-01-23 한국전자통신연구원 Apparatus and method for recognition marine situation based image division
KR20230040487A (en) * 2021-09-16 2023-03-23 삼성중공업 주식회사 Vessel having object recognition apparatus
KR102596268B1 (en) 2021-09-16 2023-10-30 삼성중공업 주식회사 Vessel having object recognition apparatus

Similar Documents

Publication Publication Date Title
KR101647743B1 (en) Navigation system of ships for avoiding collision using time series graphic interface
KR101941521B1 (en) System and method for automatic tracking of marine objects
US20210350710A1 (en) Ship movement-sharing navigation assistance system
US8896685B2 (en) Method and system for determining information relating to vacant spaces of a parking lot
Jaeyong et al. Advanced navigation aids system based on augmented reality
CA2756912C (en) Methods and systems for augmented navigation
CN107430815B (en) Method and system for displaying parking area
US7647179B2 (en) System and method for coordinate mapping onto airport diagrams
JP4327000B2 (en) Counterpart movement monitoring device
KR20110116842A (en) Electronic navigational chart display method of vessels navigation system using augmented reality
CN103376109B (en) Air route plan making device and air route scheduler verifying method
EP3125213B1 (en) Onboard aircraft systems and methods to identify moving landing platforms
KR101696615B1 (en) Control system for avoiding collision of multiple ships using time series graphic interface
CN102782739A (en) Driver assistance device having a visual representation of detected object
JP2008287379A (en) Road sign data input system
KR101657646B1 (en) Method for displaying label in augmented reality service
EP1748281B1 (en) Map generating apparatus
Gernez et al. A review of augmented reality applications for ship bridges
CN115723919A (en) Auxiliary navigation method and device for ship yaw
KR20210116799A (en) Control system for ship entry and departure of port using digital twin
KR20230023844A (en) Method For Predicting And Avoiding Ship Collision Possibility Using Digital Twin
US20140300623A1 (en) Navigation system and method for displaying photomap on navigation system
EP4047312A1 (en) Augmented reality based tidal current display device
EP4209826A1 (en) High-confidence optical head pose correspondence mapping with multiple markers for high-integrity headtracking on a headworn display (hwd)
US11808579B2 (en) Augmented reality based tidal current display apparatus and method

Legal Events

Date Code Title Description
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190625

Year of fee payment: 4