KR101733519B1 - Apparatus and method for 3-dimensional display - Google Patents

Apparatus and method for 3-dimensional display

Info

Publication number
KR101733519B1
Authority
KR
South Korea
Prior art keywords
user
gaze
information
calibration
calibration point
Prior art date
Application number
KR1020150134489A
Other languages
Korean (ko)
Other versions
KR20170037692A (en)
Inventor
석윤찬
Original Assignee
주식회사 비주얼캠프
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 비주얼캠프
Priority to KR1020150134489A
Publication of KR20170037692A
Application granted
Publication of KR101733519B1

Classifications

    • H04N13/0425
    • H04N13/0402
    • H04N13/0468
    • H04N13/0484

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A three-dimensional display device is disclosed. A three-dimensional display device according to an embodiment of the present invention includes one or more processors, a computer-readable medium on which one or more computer programs are recorded, a display unit, and a gaze tracking unit for obtaining gaze information of a user gazing at a screen of the display unit. The one or more computer programs are configured to be executed by the one or more processors and include instructions for: visually displaying, through the display unit, at least one calibration point having a different distance from the user within a three-dimensional virtual space; acquiring, through the gaze tracking unit, gaze information of the user gazing at the calibration point; and generating a calibration table containing position information of the displayed calibration point and the gaze information corresponding to the calibration point.

Description

TECHNICAL FIELD [0001] The present invention relates to a three-dimensional display device and method.

Embodiments of the invention relate to three-dimensional display technology.

Recently, as virtual reality technology has developed, the use of equipment such as head-mounted displays (HMDs) has been increasing. A head-mounted display is a display device worn on the user's head that can present three-dimensional stereoscopic images directly in front of the user's eyes. Alongside it, techniques have been developed for selecting content or entering characters by analyzing the direction and position of the user's gaze, without a separate input device such as a keyboard or mouse.

As described above, a head-mounted display is mounted directly on the user's head. Because wearing conditions and physical characteristics differ from user to user, a calibration process is required to accurately measure the direction and position of each user's gaze. In particular, since a head-mounted display provides the user with three-dimensional stereoscopic images, the calibration must also be performed in three-dimensional space. Accordingly, there has been a need for a calibration technique for three-dimensional space.

Korean Patent Laid-Open No. 10-2011-0136012 (2011.12.21)

Embodiments of the present invention are intended to provide means for calibration of a three-dimensional display device.

Embodiments of the present invention are also intended to implement a three-dimensional display device and method.

According to an exemplary embodiment of the present invention, there is provided a three-dimensional display device including one or more processors, a computer-readable medium on which one or more computer programs are recorded, a display unit, and a gaze tracking unit for obtaining gaze information of a user gazing at a screen of the display unit. The one or more computer programs are configured to be executed by the one or more processors and include instructions for executing a method comprising: visually displaying, through the display unit, at least one calibration point having a different distance from the user within a three-dimensional virtual space; acquiring, through the gaze tracking unit, gaze information of the user gazing at the calibration point; and generating a calibration table including position information of the displayed calibration point and the gaze information corresponding to the calibration point.

The program may further include instructions for, after performing the step of generating the calibration table, acquiring gaze information of the user gazing at a specific object in the three-dimensional virtual space displayed by the display unit, and calculating, with reference to the calibration table, position information corresponding to the gaze information of the user gazing at the object.

In the step of calculating the position information, when none of the gaze information included in the calibration table matches the gaze information of the user gazing at the object, the position information corresponding to the gaze information of the user gazing at the object may be calculated from the gaze information included in the calibration table on the basis of linearity.

The gaze information may include at least one of the user's gaze position and the angular difference between the gaze directions of both eyes.

The gaze tracking unit may include at least one camera for recognizing the pupils of the user, and the display unit may further include a lens through which the user's gaze passes, wherein the camera may be disposed between the lens and the screen, or between the lens and both eyes of the user.

The gaze tracking unit may further include at least one infrared generator for emitting infrared light toward both eyes of the user.

According to another exemplary embodiment of the present invention, there is provided a computer program stored in a computer-readable recording medium for causing a three-dimensional display device to execute the steps of: visually displaying, through a display unit, at least one calibration point having a different distance from the user within a three-dimensional virtual space; acquiring, through a gaze tracking unit, gaze information of the user gazing at the calibration point; and generating a calibration table including position information of the calibration point and the gaze information corresponding to the calibration point.

The computer program may further cause the three-dimensional display device, after performing the step of generating the calibration table, to acquire gaze information of the user gazing at a specific object in the three-dimensional virtual space displayed by the display unit, and to calculate, with reference to the calibration table, position information corresponding to the gaze information of the user gazing at the object.

In the step of calculating the position information, when none of the gaze information included in the calibration table matches the gaze information of the user gazing at the object, the position information corresponding to the gaze information of the user gazing at the object may be calculated from the gaze information included in the calibration table on the basis of linearity.

According to another exemplary embodiment of the present invention, there is provided a three-dimensional display method comprising the steps of: a three-dimensional display device visually displaying, through a display unit, at least one calibration point having a different distance from the user in a three-dimensional virtual space; the three-dimensional display device acquiring, through a gaze tracking unit, gaze information of the user gazing at the calibration point; and the three-dimensional display device generating a calibration table including position information of the displayed calibration point and the gaze information corresponding to the calibration point.

The method may further comprise, after performing the step of generating the calibration table, the three-dimensional display device acquiring gaze information of the user gazing at a specific object in the three-dimensional virtual space displayed by the display unit, and calculating, with reference to the calibration table, position information corresponding to the gaze information of the user gazing at the object.

In the step of calculating the position information, when none of the gaze information included in the calibration table matches the gaze information of the user gazing at the object, the position information corresponding to the gaze information of the user gazing at the object may be calculated from the gaze information included in the calibration table on the basis of linearity.

According to embodiments of the present invention, the calibration procedure of the three-dimensional display device is simplified by using the user's gaze information itself, without a separate computation for deriving the user's gaze position from that gaze information. This also prevents errors that such an additional computation could introduce.

According to another aspect of the embodiments of the present invention, a camera for acquiring images of both eyes of the user is spaced a predetermined distance from the user's eyes, so that an undistorted image of the user's eyes can be obtained and the direction of the user's gaze can be tracked accurately.

FIG. 1 is a block diagram showing the configuration of a three-dimensional display device according to an embodiment of the present invention.
FIG. 2 is an exemplary view illustrating calibration points according to an embodiment of the present invention.
FIG. 3 is an exemplary view showing a three-dimensional display device according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a three-dimensional display method according to an embodiment of the present invention.

Hereinafter, specific embodiments of the present invention will be described with reference to the drawings. The following detailed description is offered to give a comprehensive understanding of the methods, apparatus, and/or systems described herein. However, this is merely an example, and the present invention is not limited thereto.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS. Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, well-known functions or constructions are not described in detail, since they would obscure the invention in unnecessary detail. The terms used below are defined in consideration of the functions of the present invention and may vary according to the intention or custom of the user or operator; their definitions should therefore be based on the contents of this entire specification. The terminology in the detailed description is intended only to describe embodiments of the invention and should in no way be limiting. Unless specifically stated otherwise, the singular form of a term includes the plural. Expressions such as "comprising" or "including" indicate certain features, numbers, steps, operations, elements, parts, or combinations thereof, and should not be construed to preclude the presence or possibility of other features, numbers, steps, operations, elements, parts, or combinations thereof.

FIG. 1 is a block diagram showing the configuration of a three-dimensional display device 100 according to an embodiment of the present invention. The three-dimensional display device 100 according to an embodiment of the present invention is a device for displaying content on a display screen three-dimensionally according to the user's input or settings. For example, the three-dimensional display device 100 may be a smart TV, a smartphone, a PDA, a personal computer, a laptop computer, a virtual reality device configured to be worn on the user's head, smart glasses, or a head-up display (HUD).

As shown in FIG. 1, a three-dimensional display device 100 according to an embodiment of the present invention includes a display unit 110, a gaze tracking unit 120, a computer-readable medium 130, a processor 140, and a memory 150.

The display unit 110 displays content on at least one screen using image display means. The image display means visually displays various types of content and may include, for example, an LCD panel, an OLED panel, a PDP, or a transparent display.

The content may be any type of data that can be visually displayed on the screen using the image display means, such as video, images, games, web pages, virtual reality, or augmented reality. In one embodiment, the content may be an image or video having depth information, such as a three-dimensional image or a multi-depth image, and the display unit 110 may display it as if it were present in a three-dimensional virtual space. Depth information (or distance information) refers to the distance or perspective perceived by the user viewing the content displayed on the screen, that is, the distance between a virtual object displayed on the screen and the user viewing that object.

In one embodiment, the display unit 110 may represent the distance or perspective of a three-dimensional image or video using the user's binocular parallax. For example, the display unit 110 may provide slightly different images to the user's left and right eyes, thereby allowing the user to perceive depth.

Meanwhile, depending on the embodiment, the display unit 110 may further include a lens for focusing the user's two eyes on the displayed content. Specifically, since a device such as a head-mounted display is worn on the user's head, the display screen is located close to the user's eyes. With the lens disposed between the user's eyes and the display screen, the screen can be brought into accurate focus for both eyes.

Under the control of the processor 140, described later, the display unit 110 can present to the user, through the screen, at least one calibration point existing in the three-dimensional virtual space. The processor 140 may then acquire the position information of the calibration point at which the user gazed; this position information includes distance information. In this description, a calibration point means a point in the three-dimensional virtual space displayed on the display screen, used to perform calibration that adapts the three-dimensional display device 100 to the user.

The gaze tracking unit 120 obtains the gaze information of the user. The gaze information may include the user's gaze position, the angular difference between the gaze directions of both eyes, and the like. The user's gaze position may include not only the two-dimensional position of the point the user looks at on the screen, but also the three-dimensional position in the virtual space. The angular difference between the gaze directions of both eyes changes with the user's focal distance: in general, when a person gazes at a nearby object, that is, when the focal distance is short, the pupils converge toward the center and the angular difference between the gaze directions of both eyes becomes large.
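For illustration only (this numerical sketch is not part of the original disclosure), the relationship between viewing distance and the binocular angular difference can be approximated with simple geometry, assuming a hypothetical interpupillary distance:

```python
import math

def vergence_angle_deg(distance_m: float, ipd_m: float = 0.065) -> float:
    """Approximate angular difference (degrees) between the two eyes' gaze
    directions when fixating a point distance_m away, for an assumed
    interpupillary distance ipd_m. Nearer points give larger angles."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

print(round(vergence_angle_deg(0.5), 2))  # near object: ~7.44 degrees
print(round(vergence_angle_deg(5.0), 2))  # far object:  ~0.74 degrees
```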

The gaze tracking unit 120 according to an embodiment of the present invention may include a camera for acquiring images of both eyes of the user. Specifically, the gaze tracking unit 120 may track the user's gaze position through real-time analysis, for example by capturing pupil images of both eyes with the camera and detecting the center position of each pupil in the captured images. It should be noted that embodiments of the present invention are not limited to any specific gaze tracking method or algorithm. For example, the gaze position may be calculated from the position of light reflected on the user's cornea. Alternatively, reflected light from a contact lens with a built-in mirror, or the magnetic field of a contact lens with a built-in coil, may be used, and eye movement may also be detected by attaching sensors around the eye and measuring the electric field corresponding to the movement. When an electric field is used, eye movement can be detected even with the eyes closed (e.g., while sleeping).

According to one embodiment of the present invention, the camera may be disposed between the lens of the display unit 110 and the screen. The display unit 110 is configured so that the user's gaze passes through the lens to reach the screen; the camera is therefore located farther from the user than the lens. Since the camera is separated from the user's eyes by at least a certain distance and is placed close to the virtual line along which the user's gaze travels, images of both eyes can be obtained without distortion. For example, a pupil photographed from the side appears as an ellipse, which makes it difficult to track the user's exact gaze direction; by contrast, when the camera lies along the user's gaze direction, the captured pupil shape is close to a circle. The gaze tracking unit 120 can therefore acquire full images of both eyes and track the user's gaze direction more accurately. However, the position of the camera is not limited to between the lens and the screen; it may also be disposed between the lens and the user's eyes.

According to an embodiment of the present invention, during the calibration process the processor 140 may display, through the display unit, a plurality of calibration points having different distance information, and as the user gazes at each calibration point, the gaze tracking unit 120 may acquire the gaze information of both eyes for that point. As described above, the gaze information may include the user's gaze position, the angular difference between the gaze directions of both eyes, and the like. In the calibration process according to an embodiment of the present invention, the three-dimensional display device 100 acquires only the user's gaze information and generates a calibration table by mapping the acquired gaze information to the position information of the corresponding calibration point. The calibration procedure is thus simplified: the user's gaze information is used as-is, with no separate computation to derive the gaze position from it, which also prevents errors that such additional computation could introduce.
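A minimal sketch of such a mapping, assuming hypothetical container types (`GazeSample`, `CalibrationEntry`) and a `measure_gaze` callback standing in for the gaze tracking unit; none of these names come from the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GazeSample:
    x: float             # hypothetical 2D on-screen gaze position
    y: float
    vergence_deg: float  # angular difference between the two eyes' gaze directions

@dataclass(frozen=True)
class CalibrationEntry:
    point_x: float         # position of the displayed calibration point
    point_y: float
    point_distance: float  # distance of the point from the user
    gaze: GazeSample       # gaze information measured while gazing at it

def build_calibration_table(points, measure_gaze):
    """Map each displayed calibration point to the gaze information measured
    while the user gazes at it; the resulting list is the calibration table."""
    return [CalibrationEntry(px, py, pd, measure_gaze(px, py, pd))
            for (px, py, pd) in points]
```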

The gaze tracking unit 120 may further include an infrared generator. The infrared generator emits infrared light toward both eyes of the user, allowing the camera to capture a clear image of the eyes. According to an embodiment of the present invention, the infrared generator may be disposed around the lens of the display unit 110 through which the user's gaze passes. The gaze tracking unit 120 may also include a plurality of infrared generators arranged around the lens so that the illuminated area of the pupil does not change with the direction of the user's gaze.

The computer-readable medium 130 is a device that records data, programs, and the like executed by a computing device. According to one embodiment of the present invention, the computer-readable medium may include program instructions, local data files, local data structures, and the like, alone or in combination. The medium may be specially designed and constructed for the present invention, or may be of a kind commonly available in the computer software arts. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Program instructions may include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. For example, the computer-readable medium 130 may store an algorithm and/or a calibration table used by the processor 140 to perform the calibration process.

The processor 140 may be, for example, a central processing unit (CPU) that executes software or controls hardware using the computer-readable medium 130, although it is not limited thereto. According to one embodiment of the present invention, the processor 140 may execute at least one program stored in the computer-readable medium 130. For example, the processor 140 may visually display, through the display unit 110, at least one calibration point having a different distance from the user in the three-dimensional virtual space, acquire gaze information of the user gazing at the calibration point, and generate a calibration table including position information of the calibration point and the gaze information of the user corresponding to the calibration point. The generated calibration table may also be stored in the computer-readable medium 130 described above.

The calibration table is data obtained by mapping the position information of one or more calibration points, each separated from the user by a different distance, to the user's gaze information for each of those points; it therefore contains at least one pair of position information and corresponding gaze information. According to an embodiment of the present invention, after the calibration process, when the user gazes at content displayed in the three-dimensional virtual space, the processor 140 acquires the user's gaze information through the gaze tracking unit 120 and obtains the position the user is looking at by referring to the stored calibration table. If the calibration table contains no gaze information matching the information obtained by the gaze tracking unit 120, new position information may be calculated from a combination of the gaze information entries present in the table.

As an example, new position information can be calculated using the proportional relationship between two or more stored gaze information values. Suppose the angular differences between the gaze directions of both eyes stored in the calibration table, measured while gazing at calibration points, are 0.1° and 0.3°. If the angular difference measured while the user gazes at an object is 0.2°, the data stored in the calibration table cannot be used as-is. Instead, the distance information of the point the user is gazing at can be calculated from the stored gaze information on the basis of linearity: the distance corresponding to 0.2° is the intermediate value between the distances corresponding to 0.1° and 0.3° in the calibration table. This method of calculating distance information is only an example, however, and the method of calculating the distance of the point the user is gazing at is not limited thereto.
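A sketch of this interpolation, assuming the table has been reduced to sorted (angle, distance) pairs; the linearity assumption follows the passage above rather than measured optics:

```python
def interpolate_distance(pairs, angle):
    """Estimate gaze distance for a measured angular difference by linear
    interpolation between the two nearest (angle, distance) calibration
    entries. `pairs` must be sorted by angle and hold at least two entries."""
    assert len(pairs) >= 2
    if angle <= pairs[0][0]:
        (a0, d0), (a1, d1) = pairs[0], pairs[1]           # extrapolate below range
    elif angle >= pairs[-1][0]:
        (a0, d0), (a1, d1) = pairs[-2], pairs[-1]         # extrapolate above range
    else:
        for (a0, d0), (a1, d1) in zip(pairs, pairs[1:]):  # find bracketing segment
            if a0 <= angle <= a1:
                break
    return d0 + (angle - a0) / (a1 - a0) * (d1 - d0)

# Entries at 0.1 deg and 0.3 deg; a measured 0.2 deg yields the midpoint distance.
print(interpolate_distance([(0.1, 5.0), (0.3, 1.0)], 0.2))  # -> 3.0
```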

In addition to the distance information, the two-dimensional position of the point at which the user is gazing may be calculated from two or more gaze direction entries stored in the calibration table on the basis of linearity, for example by bilinear interpolation.
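And a sketch of bilinear interpolation of such a two-dimensional value from four surrounding calibration entries (the coordinate convention is assumed for illustration):

```python
def bilinear(x0, x1, y0, y1, q00, q10, q01, q11, x, y):
    """Bilinearly interpolate a value at (x, y) from values q00..q11 measured
    at the corners (x0, y0), (x1, y0), (x0, y1), (x1, y1)."""
    tx = (x - x0) / (x1 - x0)
    ty = (y - y0) / (y1 - y0)
    top = q00 * (1 - tx) + q10 * tx      # interpolate along x at y0
    bottom = q01 * (1 - tx) + q11 * tx   # interpolate along x at y1
    return top * (1 - ty) + bottom * ty  # then along y

# A gaze direction midway between four calibration entries maps to the
# average of their recorded values.
print(bilinear(0.0, 1.0, 0.0, 1.0, 10.0, 20.0, 30.0, 40.0, 0.5, 0.5))  # -> 25.0
```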

According to an exemplary embodiment of the present invention, when a plurality of overlapping screen windows are displayed in the three-dimensional virtual space and the visible area of a rear window is reduced by a window in front of it, it is difficult for the three-dimensional display device 100 to determine from the two-dimensional gaze position alone which window the user is gazing at. In this case, the three-dimensional display device 100 can easily determine the window the user is looking at by acquiring the distance information of the user's gaze position.

In one embodiment, the display unit 110 and the gaze tracking unit 120 may be implemented on a computing device that includes one or more processors 140 and a computer-readable medium 130 coupled to the processor 140. The computer-readable medium 130 may be internal or external to the processor 140 and may be coupled to it in a variety of well-known ways. The processor 140 in the computing device may cause the device to operate in accordance with the exemplary embodiments described herein; for example, the processor may execute instructions stored on the computer-readable medium that, when executed by the processor 140, configure the computing device to perform the operations described.

FIG. 2 is an exemplary view illustrating calibration points according to an embodiment of the present invention. As shown in FIG. 2, when performing the calibration process, the processor can display calibration points through the display unit. A plurality of calibration points at different distances from the user may be displayed in order to obtain three-dimensional position information for the user's gaze. FIG. 2 shows three planes separated from the user by a short, medium, and long distance, with nine calibration points on each plane. The gaze information of the user gazing at each calibration point can be mapped to that point, and the mapped position and gaze information constitute the calibration table. When, after the calibration process, the user gazes at an object in the virtual space displayed on the screen, the user's gaze position can be determined by referring to the pre-stored calibration table as described above; this also holds when the user's gaze information is not itself present in the table. Likewise, even when the user gazes outside the rectangle formed by the calibration points, the position of the point being gazed at can be calculated using linearity.

According to an embodiment of the present invention, the processor 140 may highlight each calibration point through the display unit 110, for example by making it glow or temporarily changing its size, so that the user knows where to look. As the user gazes at the highlighted point, the processor 140 acquires the user's gaze information through the gaze tracking unit 120 and generates the calibration table by matching the position of the highlighted point with the acquired gaze information. There is, however, no particular limitation on how a calibration point is indicated to the user.

FIG. 2 shows three planes having different distance information, with nine calibration points on each plane. However, this is only an example for ease of understanding; neither the number of distance planes nor the number of calibration points is limited thereto. For example, a single calibration point may be displayed on the farthest plane, sixteen calibration points may be displayed on one plane, or five planes having different distance information may be displayed on the screen.
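A sketch of generating such a layout (the normalized 3x3 grid per plane and the specific distances are assumptions, not values from the patent):

```python
def calibration_points(distances=(0.5, 2.0, 8.0), grid=3, extent=0.8):
    """Yield (x, y, distance) calibration points: a grid x grid layout of
    normalized screen positions, repeated on one plane per distance."""
    step = 2 * extent / (grid - 1)
    for d in distances:
        for i in range(grid):
            for j in range(grid):
                yield (-extent + i * step, -extent + j * step, d)

print(len(list(calibration_points())))  # 27: nine points on each of three planes
```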

FIG. 3 is an exemplary view showing a three-dimensional display device 100 according to an embodiment of the present invention. FIGS. 3A and 3B are perspective views of the three-dimensional display device 100 according to exemplary embodiments of the present invention.

Referring to FIG. 3A, the three-dimensional display device 100 according to an embodiment of the present invention may include a screen 310 of the display unit and a lens 320 of the display unit. As described above, through the interaction of the screen 310 and the lens 320, the display unit 110 can provide a three-dimensional image or video to the user.

The area A shown in FIG. 3A indicates where the camera of the gaze tracking unit 120 according to an embodiment of the present invention is disposed. As described above, since a certain distance is maintained between the camera and the user's eyes, an image of the user's entire pupil can be obtained. Accordingly, the gaze tracking unit 120 can accurately track the user's gaze direction.

Referring to FIG. 3B, the three-dimensional display device 100 according to an embodiment of the present invention may include a lens 320 and an infrared generator 330. As described above, the infrared generator 330, which emits infrared light toward both eyes of the user, may be disposed around the lens 320, and a plurality of infrared generators 330 may be added as needed. Infrared light is thus directed at the user's pupils along the direction of gaze, so that even if the gaze direction shifts slightly, the infrared generators 330 continue to illuminate the pupils uniformly. As noted above, the number and positions of the infrared generators shown in FIG. 3B are not particularly limited.

FIG. 4 is a flowchart illustrating a three-dimensional display method 400 according to an embodiment of the present invention. The method shown in FIG. 4 can be performed, for example, by the three-dimensional display device 100 described above. In the illustrated flowchart the method is divided into a plurality of steps, but at least some of the steps may be performed in a different order, combined with other steps, performed together, omitted, divided into sub-steps, or supplemented with one or more additional steps.

In step 402, the three-dimensional display device 100 can visually display a plurality of calibration points having different distances from the user in the three-dimensional virtual space.

In step 404, the three-dimensional display device 100 can obtain gaze information of the user gazing at the calibration point.

In step 406, the three-dimensional display device 100 may generate a calibration table including position information of the calibration point and the gaze information corresponding to the calibration point.

In step 408, when the user gazes at an object in the three-dimensional virtual space displayed on the screen, the three-dimensional display device 100 acquires the user's gaze information and calculates the corresponding gaze position by referring to the calibration table. According to one embodiment, the user's gaze position can be calculated using the linearity of the data stored in the calibration table.
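Reusing the hypothetical helpers from the sketches above (`calibration_points`, `build_calibration_table`, `interpolate_distance`), steps 402 to 408 might be tied together as follows; `measure_gaze` and `live_gaze` again stand in for the gaze tracking unit:

```python
def run_display_method(measure_gaze, live_gaze):
    points = list(calibration_points())                    # step 402
    table = build_calibration_table(points, measure_gaze)  # steps 404-406
    # Reduce the table to sorted (angle, distance) pairs; identical pairs
    # (e.g., points on the same plane) collapse in the set.
    pairs = sorted({(e.gaze.vergence_deg, e.point_distance) for e in table})
    # Step 408: estimate the distance the user is now gazing at.
    return interpolate_distance(pairs, live_gaze.vergence_deg)
```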

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, those skilled in the art will understand that various modifications may be made without departing from the scope of the invention, and that the invention is not limited to the disclosed exemplary embodiments. Therefore, the scope of the present invention should not be limited to the above-described embodiments, but should be determined by the appended claims and their equivalents.

100: three-dimensional display device
110: display unit
120: gaze tracking unit
130: Computer readable medium
140: Processor
310: Screen
320: lens
330: Infrared ray generator

Claims (12)

A three-dimensional display device comprising:
One or more processors;
A computer-readable medium on which one or more computer programs are recorded;
A display unit; and
A gaze tracking unit for obtaining gaze information of a user gazing at a screen of the display unit,
Wherein the one or more computer programs are configured to be executed by the one or more processors,
The program includes:
Visually displaying at least one calibration point having a different distance from the user within the three-dimensional virtual space through the display unit;
Acquiring gaze information of the user gazing at the calibration point through the gaze tracking unit; and
Generating a calibration table including position information of the displayed calibration point and the gaze information corresponding to the calibration point,
Wherein the gaze tracking unit includes at least one camera for recognizing the pupil of the user,
Wherein the display unit further includes a lens through which the user's gaze passes, and
Wherein the camera is disposed between the lens and the screen, in the traveling direction of the user's gaze, such that the shape of the user's pupil recognized by the camera is circular.
The three-dimensional display device according to claim 1,
Wherein the program further comprises instructions for, after performing the step of generating the calibration table:
Acquiring gaze information of the user gazing at a specific object in the three-dimensional virtual space displayed by the display unit; and
Calculating, with reference to the calibration table, position information corresponding to the gaze information of the user gazing at the object.
The three-dimensional display device according to claim 2,
Wherein the step of calculating the position information comprises, when none of the gaze information included in the calibration table matches the gaze information of the user gazing at the object, calculating the position information corresponding to the gaze information of the user gazing at the object from the gaze information included in the calibration table on the basis of linearity.
The three-dimensional display device according to claim 1,
Wherein the gaze information includes at least one of the user's gaze position and an angular difference between the gaze directions of both eyes.
delete
The three-dimensional display device according to claim 1,
Wherein the gaze tracking unit further comprises one or more infrared generators for emitting infrared light toward both eyes of the user.
delete
delete
delete
The three-dimensional display device visually displaying, through the display unit, at least one calibration point having a different distance from the user in the three-dimensional virtual space;
The three-dimensional display device acquiring gaze information of the user gazing at the calibration point through a gaze tracking unit; and
The three-dimensional display device generating a calibration table including position information of the displayed calibration point and the gaze information corresponding to the calibration point,
Wherein the three-dimensional display device further comprises at least one camera for recognizing the pupil of the user and a lens through which the user's gaze passes,
Wherein the camera is disposed between the lens and the screen on which the calibration point is displayed, in the traveling direction of the user's gaze, so that the shape of the user's pupil recognized by the camera is circular.
The method of claim 10,
Further comprising, after performing the step of generating the calibration table:
The three-dimensional display device acquiring gaze information of the user gazing at a specific object in the three-dimensional virtual space displayed by the display unit; and
The three-dimensional display device calculating, with reference to the calibration table, position information corresponding to the gaze information of the user gazing at the object.
The method of claim 11,
Wherein the step of calculating the position information comprises, when none of the gaze information included in the calibration table matches the gaze information of the user gazing at the object, calculating the position information corresponding to the gaze information of the user gazing at the object from the gaze information included in the calibration table on the basis of linearity.
KR1020150134489A 2015-09-23 2015-09-23 Apparatus and method for 3-dimensional display KR101733519B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150134489A KR101733519B1 (en) 2015-09-23 2015-09-23 Apparatus and method for 3-dimensional display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150134489A KR101733519B1 (en) 2015-09-23 2015-09-23 Apparatus and method for 3-dimensional display

Publications (2)

Publication Number Publication Date
KR20170037692A KR20170037692A (en) 2017-04-05
KR101733519B1 (en) 2017-05-08

Family

ID=58586851

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150134489A KR101733519B1 (en) 2015-09-23 2015-09-23 Apparatus and method for 3-dimensional display

Country Status (1)

Country Link
KR (1) KR101733519B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230166292A (en) 2022-05-30 2023-12-07 엔에이치엔클라우드 주식회사 Method and system for user-customized eye tracking calibration

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220005958A (en) * 2020-07-07 2022-01-14 삼성전자주식회사 Device and method for correcting user’s eyesight and performing calibration


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101691564B1 (en) 2010-06-14 2016-12-30 주식회사 비즈모델라인 Method for Providing Augmented Reality by using Tracking Eyesight

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9030532B2 (en) * 2004-08-19 2015-05-12 Microsoft Technology Licensing, Llc Stereoscopic image display
JP2006108868A (en) * 2004-10-01 2006-04-20 Canon Inc Image display apparatus and image display system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
3-D Gaze Estimation and Interaction Technique (in Korean), Journal of Broadcast Engineering, Vol. 11, No. 4 (2006)*


Also Published As

Publication number Publication date
KR20170037692A (en) 2017-04-05

Similar Documents

Publication Publication Date Title
JP5923603B2 (en) Display device, head mounted display, calibration method, calibration program, and recording medium
KR102460047B1 (en) Head up display with eye tracking device determining user spectacles characteristics
US20220130124A1 (en) Artificial reality system with varifocal display of artificial reality content
US11861062B2 (en) Blink-based calibration of an optical see-through head-mounted display
US10623721B2 (en) Methods and systems for multiple access to a single hardware data stream
US20200201038A1 (en) System with multiple displays and methods of use
US11995847B2 (en) Glasses-free determination of absolute motion
US11057606B2 (en) Method and display system for information display based on positions of human gaze and object
US20210090323A1 (en) Rendering Computer-Generated Reality Text
KR20160096392A (en) Apparatus and Method for Intuitive Interaction
KR101733519B1 (en) Apparatus and method for 3-dimensional display
JP6446465B2 (en) I / O device, I / O program, and I / O method
JP2019102828A (en) Image processing system, image processing method, and image processing program
KR101817952B1 (en) See-through type head mounted display apparatus and method of controlling display depth thereof
US9218104B2 (en) Image processing device, image processing method, and computer program product
Kato et al. 3D Gaze on Stationary and Moving Visual Targets in Mixed Reality Environments
WO2024047990A1 (en) Information processing device
JP6206949B2 (en) Field-of-view restriction image data creation program and field-of-view restriction device using the same
WO2023195995A1 (en) Systems and methods for performing a motor skills neurological test using augmented or virtual reality
WO2023244267A1 (en) Systems and methods for human gait analysis, real-time feedback and rehabilitation using an extended-reality device
US20190114838A1 (en) Augmented reality system and method for providing augmented reality
Duchowski et al. Head-Mounted System Calibration

Legal Events

Date Code Title Description
AMND Amendment
E601 Decision to refuse application
AMND Amendment
X701 Decision to grant (after re-examination)