KR101733519B1 - Apparatus and method for 3-dimensional display - Google Patents
- Publication number
- KR101733519B1 (application KR1020150134489A)
- Authority
- KR
- South Korea
- Prior art keywords
- user
- gaze
- information
- calibration
- calibration point
- Prior art date
Classifications
- H04N13/0425
- H04N13/0402
- H04N13/0468
- H04N13/0484
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
A three-dimensional display device is disclosed. A three-dimensional display device according to an embodiment of the present invention includes one or more processors, a computer-readable medium on which one or more computer programs are recorded, a display unit, and a gaze tracking unit for obtaining gaze information of a user gazing at a screen of the display unit. The one or more computer programs are configured to be executed by the one or more processors and include instructions for: visually displaying, through the display unit, at least one calibration point having a different distance from the user within a three-dimensional virtual space; acquiring, through the gaze tracking unit, gaze information of the user gazing at the calibration point; and generating a calibration table containing the position information of the displayed calibration point and the gaze information corresponding to the calibration point.
Description
Embodiments of the invention relate to three-dimensional display technology.
Recently, with the development of virtual reality technology, the use of augmented reality equipment such as head mounted displays (HMDs) has been increasing. A head mounted display is a display device worn on the user's head that can present three-dimensional stereoscopic images directly in front of the user's eyes. In step with this, techniques have been developed for selecting content or inputting characters by analyzing the direction and position of the user's gaze, without a separate input device such as a keyboard or mouse.
In the case of a head mounted display, the user wears the device directly on the head, as described above. Because each user's wearing conditions and physical characteristics differ, a calibration process is required to accurately measure the direction and position of the user's gaze. In particular, since the head mounted display provides the user with three-dimensional stereoscopic images, the calibration must also be performed in three-dimensional space. Accordingly, there has been a need for a calibration technique for three-dimensional space.
Embodiments of the present invention are intended to provide means for calibration of a three-dimensional display device.
Embodiments of the present invention are also intended to implement a three-dimensional display device and method.
According to an exemplary embodiment of the present invention, there is provided a three-dimensional display device including one or more processors, a computer-readable medium on which one or more computer programs are recorded, a display unit, and a gaze tracking unit for obtaining gaze information of a user gazing at a screen of the display unit, wherein the one or more computer programs are configured to be executed by the one or more processors and include instructions for executing a method comprising: visually displaying, through the display unit, at least one calibration point having a different distance from the user within a three-dimensional virtual space; acquiring, through the gaze tracking unit, gaze information of the user gazing at the calibration point; and generating a calibration table including the position information of the displayed calibration point and the gaze information corresponding to the calibration point.
The program may further include instructions for executing, after performing the step of generating the calibration table: acquiring gaze information of a user gazing at a specific object in the three-dimensional virtual space displayed by the display unit; and calculating, with reference to the calibration table, position information corresponding to the gaze information of the user gazing at the object.
In the step of calculating the position information, when no gaze information included in the calibration table coincides with the gaze information of the user gazing at the object, the position information corresponding to the gaze information of the user gazing at the object may be calculated, based on linearity, from the gaze information included in the calibration table.
The gaze information may include at least one of a gaze position of the user and an angle difference in a gaze direction of both eyes.
The gaze tracking unit may include at least one camera for recognizing the pupil of the user, and the display unit may further include a lens through which the user's gaze passes, wherein the camera may be disposed between the lens and the screen, or between the lens and both eyes of the user.
The gaze tracking unit may further include at least one infrared ray generator for emitting infrared rays toward both eyes of the user.
According to another exemplary embodiment of the present invention, there is provided a computer program stored in a computer-readable recording medium for causing a three-dimensional display device to execute: visually displaying, through a display unit, at least one calibration point having a different distance from a user within a three-dimensional virtual space; acquiring, through a gaze tracking unit, gaze information of the user gazing at the calibration point; and generating a calibration table including the position information of the calibration point and the gaze information corresponding to the calibration point.
The computer program may further cause the three-dimensional display device to execute, after performing the step of generating the calibration table: acquiring gaze information of a user gazing at a specific object in the three-dimensional virtual space displayed by the display unit; and calculating, with reference to the calibration table, position information corresponding to the gaze information of the user gazing at the object.
In the step of calculating the position information, when no gaze information included in the calibration table coincides with the gaze information of the user gazing at the object, the position information corresponding to the gaze information of the user gazing at the object may be calculated, based on linearity, from the gaze information included in the calibration table.
According to another exemplary embodiment of the present invention, there is provided a three-dimensional display method comprising: visually displaying, by a three-dimensional display device through a display unit, at least one calibration point having a different distance from a user in a three-dimensional virtual space; acquiring, through a gaze tracking unit, gaze information of the user gazing at the calibration point; and generating, by the three-dimensional display device, a calibration table including the position information of the displayed calibration point and the gaze information corresponding to the calibration point.
The method may further comprise, after performing the step of generating the calibration table: acquiring gaze information of a user gazing at a specific object in the three-dimensional virtual space displayed by the display unit; and calculating, with reference to the calibration table, position information corresponding to the gaze information of the user gazing at the object.
In the step of calculating the position information, when no gaze information included in the calibration table coincides with the gaze information of the user gazing at the object, the position information corresponding to the gaze information of the user gazing at the object may be calculated, based on linearity, from the gaze information included in the calibration table.
According to the embodiments of the present invention, the calibration procedure of the three-dimensional display device is simplified by using the user's gaze information itself, without a separate calculation procedure for deriving the user's gaze position from the gaze information. In addition, errors caused by such additional calculation procedures can be prevented.
According to another embodiment of the present invention, a camera for acquiring images of both eyes of the user is disposed a predetermined distance away from the user's eyes, so that an undistorted image of the user's eyes can be obtained and the direction of the gaze can be tracked accurately.
FIG. 1 is a block diagram showing the configuration of a three-dimensional display device according to an embodiment of the present invention.
FIG. 2 is an exemplary diagram illustrating calibration points according to an embodiment of the present invention.
FIG. 3 is an exemplary diagram illustrating a three-dimensional display device according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a three-dimensional display method according to an embodiment of the present invention.
Hereinafter, specific embodiments of the present invention will be described with reference to the drawings. The following detailed description is provided to assist in a comprehensive understanding of the methods, apparatus, and/or systems described herein. However, this is merely exemplary, and the present invention is not limited thereto.
In the following description, well-known functions or constructions are not described in detail where they would obscure the invention with unnecessary detail. The terms used below are defined in consideration of the functions of the present invention and may vary according to the intention or custom of a user or operator; their definitions should therefore be based on the contents of this entire specification. The terms used in the detailed description are intended only to describe embodiments of the invention and should in no way be limiting. Unless expressly used otherwise, a singular form includes a plural form. Expressions such as "comprises" or "comprising" are intended to indicate certain features, numbers, steps, operations, elements, parts, or combinations thereof, and should not be construed to preclude the presence or possibility of other features, numbers, steps, operations, elements, parts, or combinations thereof.
FIG. 1 is a block diagram showing the configuration of a three-dimensional display device according to an embodiment of the present invention.
Referring to FIG. 1, a three-dimensional display device 100 according to an embodiment of the present invention includes a display unit 110, a gaze tracking unit 120, a computer-readable medium 130, and a processor 140.
The display unit 110 displays content on a screen. The content may be any of various types of data that can be visually displayed on the screen using image display means, such as a moving image, an image, a game, a web page, virtual reality, and augmented reality. In one embodiment, the content may be an image or a video having depth information, such as a three-dimensional image or a multi-depth image.
In one embodiment, the
Meanwhile, according to the embodiment, the
By the operation of the processor 140 to be described later, the
The
The
According to one embodiment of the present invention, the camera may be disposed between the lens of the display unit 110 and the screen, or between the lens and both eyes of the user.
According to an embodiment of the present invention, in the calibration process, the processor 140 may display a plurality of calibration points having different distance information through the display unit 110, and when the user gazes at each calibration point, the gaze tracking unit 120 can acquire gaze information of both eyes of the user for each calibration point gazed at by the user. At this time, the gaze information may include the gaze position of the user, the angle difference between the gaze directions of the user's two eyes, and the like, as described above.
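The angle difference between the gaze directions of the two eyes (the vergence angle) is the quantity that encodes distance in this scheme. As a minimal sketch, not taken from the patent, such a value could be computed from two unit gaze-direction vectors as follows (the function name and the vector representation are assumptions):

```python
import math

def vergence_angle_deg(left_dir, right_dir):
    """Angle in degrees between two 3D unit gaze-direction vectors.

    A larger angle means the eyes converge more strongly,
    i.e. the gaze point is closer to the user.
    """
    dot = sum(l * r for l, r in zip(left_dir, right_dir))
    dot = max(-1.0, min(1.0, dot))  # guard against rounding drift
    return math.degrees(math.acos(dot))

# Both eyes looking straight ahead -> 0 degrees of vergence.
print(vergence_angle_deg((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # 0.0
```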
The
The computer-readable medium 130 is a device that records data, programs, and the like executed by a computing device. According to one embodiment of the present invention, the computer-readable medium may include program instructions, local data files, local data structures, and the like, alone or in combination. The medium may be specially designed and constructed for the present invention, or may be one commonly available in the computer software art. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of programs include machine language code such as that produced by a compiler, as well as high-level language code that may be executed by a computer using an interpreter or the like. For example, the computer-readable medium 130 may store an algorithm and/or a calibration table used by the processor 140 to perform the calibration process.
The processor 140 may be, but is not limited to, a central processing unit (CPU) that executes software or controls hardware using the computer-readable medium 130. According to one embodiment of the present invention, the processor 140 may execute at least one program stored in the computer-readable medium 130. For example, the processor 140 may visually display, through the display unit 110, at least one calibration point at a different distance from the user in the three-dimensional virtual space, acquire gaze information of the user gazing at the calibration point through the gaze tracking unit 120, and generate a calibration table including the position information of the displayed calibration point and the corresponding gaze information.
The calibration table is data obtained by mapping the position information of at least one calibration point, each separated from the user by a different distance, to the user's gaze information for that calibration point. Accordingly, the calibration table includes at least one piece of position information and the corresponding gaze information of the user. According to an embodiment of the present invention, when the user gazes at content displayed in the three-dimensional virtual space after the calibration process has been performed, the processor 140 can calculate the position the user is gazing at by referring to the calibration table.
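The calibration table described above is simply a mapping from gaze measurements to known 3D positions. A minimal sketch of one possible representation (the field names and sample values are assumptions for illustration, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class CalibrationEntry:
    gaze_pos: tuple      # 2D gaze position on the screen, e.g. (x, y)
    angle_diff: float    # angle difference of the two eyes' gaze directions, degrees
    point_pos: tuple     # known 3D position of the calibration point (x, y, z)

# One entry per displayed calibration point: gaze info -> known position.
table = [
    CalibrationEntry((0.5, 0.5), 0.1, (0.0, 0.0, 3.0)),
    CalibrationEntry((0.5, 0.5), 0.3, (0.0, 0.0, 1.0)),
]
```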
As an example, new position information can be calculated using the proportional relationship between two or more stored gaze information values. Suppose that, by gazing at calibration points during the calibration process, angle differences of 0.1° and 0.3° between the gaze directions of the user's two eyes were stored in the calibration table. If the angle difference measured while the user gazes at an object is 0.2°, the data stored in the calibration table cannot be used as-is. Instead, the distance information of the point the user is gazing at can be calculated from the stored gaze information based on linearity: the distance corresponding to 0.2° is the intermediate value of the distances corresponding to 0.1° and 0.3° in the calibration table. However, this method of calculating the distance information is only an example, and the method of calculating the distance of the gaze point is not limited thereto.
In addition to the distance of the gaze point, the position information on the two-dimensional position of the point at which the user is gazing may be calculated based on the linearity of two or more pieces of gaze direction information stored in the calibration table, for example by the method of bilinear interpolation.
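The bilinear interpolation mentioned above can be sketched generically over the four calibration entries surrounding the query point (the cell coordinates and corner values here are assumptions for illustration):

```python
def bilinear(x, y, q11, q21, q12, q22, x1, x2, y1, y2):
    """Bilinear interpolation of corner values q for a query (x, y).

    q11 is the value at (x1, y1), q21 at (x2, y1),
    q12 at (x1, y2), q22 at (x2, y2).
    """
    tx = (x - x1) / (x2 - x1)
    ty = (y - y1) / (y2 - y1)
    bottom = q11 * (1 - tx) + q21 * tx  # interpolate along y = y1
    top = q12 * (1 - tx) + q22 * tx     # interpolate along y = y2
    return bottom * (1 - ty) + top * ty

# A query at the cell centre averages the four corner values.
print(bilinear(0.5, 0.5, 0.0, 1.0, 1.0, 2.0, 0.0, 1.0, 0.0, 1.0))  # 1.0
```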
According to an exemplary embodiment of the present invention, when a plurality of overlapping screen windows are displayed in the three-dimensional virtual space, with the screen window disposed at the rear partially hidden behind the one disposed at the front, it is difficult for the three-dimensional display device to determine from the two-dimensional gaze position alone which window the user is gazing at; the distance information obtained through the calibration table can be used to make this distinction.
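When windows overlap at the same 2D gaze position, the gaze depth recovered through the calibration table can disambiguate them. A hypothetical sketch (the window list and its fields are assumptions, not the patent's interface):

```python
def pick_window(gaze_depth, windows):
    """Return the window whose depth is closest to the user's gaze depth.

    windows: list of (name, depth) pairs for the windows overlapping
    the 2D gaze position; gaze_depth is derived from the calibration
    table via the vergence of the user's eyes.
    """
    return min(windows, key=lambda w: abs(w[1] - gaze_depth))

# Front window at 1.0 m, rear window at 2.5 m; vergence says ~2.3 m.
print(pick_window(2.3, [("front", 1.0), ("rear", 2.5)]))  # ('rear', 2.5)
```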
In one embodiment, the
FIG. 2 is an exemplary view illustrating calibration points according to an embodiment of the present invention. As shown in FIG. 2, when performing the calibration process, the processor can display calibration points through the display unit. A plurality of calibration points having different distances from the user may be displayed in order to obtain three-dimensional position information of the user's line of sight. FIG. 2 shows three planes separated from the user by a short, a medium, and a long distance, with nine calibration points on each plane. The gaze information of the user gazing at each calibration point can be mapped to that point, and the mapped position information and gaze information constitute the calibration table. When the user gazes at an object in the virtual space displayed on the screen after the calibration process has been performed, the user's gaze position can be determined by referring to the pre-stored calibration table, as described above. This holds even when the user's gaze information is not present in the calibration table, and even when the user gazes outside the rectangle formed by the calibration points, since the position of the gaze point can be calculated using linearity.
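The layout of FIG. 2 (nine points on each of three planes at near, medium, and far distances) can be sketched as a simple grid generator; the concrete depths and extents below are assumed values for illustration:

```python
def calibration_points(depths=(1.0, 2.0, 4.0), grid=3, extent=0.4):
    """Generate a grid of calibration points on each depth plane.

    Returns a list of (x, y, z) positions: grid*grid points per plane,
    spanning +/- extent around the centre, one plane per depth.
    """
    step = 2 * extent / (grid - 1)
    return [(-extent + i * step, -extent + j * step, z)
            for z in depths
            for i in range(grid)
            for j in range(grid)]

points = calibration_points()
print(len(points))  # 27: nine points per plane, three planes
```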
According to an embodiment of the present invention, the processor 140 may guide the user to gaze at each of the calibration points in turn through the display unit 110.
FIG. 2 shows three planes having different distance information, with nine calibration points on each plane. However, this is only an example for ease of understanding; the number of distance planes and calibration points is not limited thereto. For example, one calibration point may be displayed on the far plane, sixteen calibration points may be displayed on one plane, or five planes having different distance information may be displayed on the screen.
FIG. 3 is an exemplary view showing a three-dimensional display device according to an embodiment of the present invention.
Referring to FIG. 3A, the three-dimensional display device 100 may include a screen 310, a lens 320 through which the user's gaze passes, and an infrared ray generator 330.
The area A shown in FIG. 3A indicates an area in which the camera of the gaze tracking unit 120 may be disposed.
Referring to FIG. 3B, the three-
FIG. 4 is a flowchart illustrating a three-dimensional display method according to an embodiment of the present invention.
Referring to FIG. 4, the three-dimensional display device 100 first visually displays, through the display unit 110, at least one calibration point at a different distance from the user in the three-dimensional virtual space. Next, the device acquires, through the gaze tracking unit 120, gaze information of the user gazing at the calibration point. The device then generates a calibration table including the position information of each displayed calibration point and the corresponding gaze information. Finally, when the user gazes at a specific object in the displayed three-dimensional virtual space, the device calculates position information corresponding to the user's gaze information with reference to the calibration table.
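Putting the steps of FIG. 4 together, the whole procedure amounts to calibrate-then-look-up. The sketch below is a hypothetical skeleton, not the patent's implementation: `measure_gaze` stands in for the display and gaze tracking units, and nearest-neighbour lookup stands in for the interpolation described above:

```python
def build_calibration_table(points, measure_gaze):
    """Steps 1-3: display each point, record the user's gaze info for it."""
    return [(measure_gaze(p), p) for p in points]

def locate(gaze, table):
    """Step 4: find the table entry whose gaze info best matches."""
    return min(table, key=lambda entry: abs(entry[0] - gaze))[1]

# Simulated user whose vergence angle is 1.0 / distance.
table = build_calibration_table([(0, 0, 1.0), (0, 0, 2.0)],
                                lambda p: 1.0 / p[2])
print(locate(0.55, table))  # (0, 0, 2.0): 0.5 deg is the closer match
```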
While the present invention has been shown and described in detail with reference to exemplary embodiments thereof, those skilled in the art will appreciate that various changes in form and detail may be made without departing from the spirit and scope of the invention. Therefore, the scope of the present invention should not be limited to the described embodiments, but should be determined by the appended claims and their equivalents.
100: three-dimensional display device
110: Display unit
120: Gaze tracking unit
130: Computer readable medium
140: Processor
310: Screen
320: lens
330: Infrared ray generator
Claims (12)
One or more processors;
A computer-readable medium on which one or more computer programs are recorded;
A display unit; And
And a gaze tracking unit for obtaining gaze information of a user gazing at a screen of the display unit,
Wherein the one or more computer programs are configured to be executed by the one or more processors,
The program includes:
Visually displaying at least one calibration point having a different distance from the user within the three-dimensional virtual space through the display unit;
Acquiring gaze information of the user gazing at the calibration point through the gaze tracking unit; And
And generating a calibration table including position information of the calibration point displayed and the gaze information corresponding to the calibration point,
Wherein the gaze tracking unit includes at least one camera for recognizing the pupil of the user,
The display unit may further include a lens through which a line of sight of the user passes,
Wherein the camera is disposed between the lens and the screen, in the traveling direction of the user's gaze, such that the shape of the pupil of the user recognized by the camera is circular.
Wherein the program further comprises: after performing the step of generating the calibration table,
Acquiring gaze information of a user gazing at a specific object in a three-dimensional virtual space displayed by a display unit; And
And calculating positional information corresponding to the gaze information of the user gazing at the object with reference to the calibration table.
Wherein the step of calculating the position information comprises, when no gaze information included in the calibration table coincides with the gaze information of the user gazing at the object, calculating, based on linearity, the position information corresponding to the gaze information of the user gazing at the object from the gaze information included in the calibration table.
Wherein the gaze information includes at least one of a gaze position of the user and an angle difference in a gaze direction of both eyes.
Wherein the gaze tracking unit further comprises one or more infrared ray generators for emitting infrared rays toward both eyes of the user.
A computer program stored in a computer-readable recording medium for causing a three-dimensional display device to execute:
The three-dimensional display device visually displaying, through a display unit, at least one calibration point having a different distance from a user within a three-dimensional virtual space;
The three-dimensional display device acquiring gaze information of the user gazing at the calibration point through a gaze tracking unit; And
And generating a calibration table including the position information of the calibration point on which the three-dimensional display device is displayed and the sight line information corresponding to the calibration point,
Wherein the three-dimensional display device further comprises at least one camera for recognizing the pupil of the user and a lens through which the user's gaze passes,
Wherein the camera is disposed between the lens and the screen on which the calibration point is displayed, in the traveling direction of the user's gaze, such that the shape of the pupil of the user recognized by the camera is circular.
After performing the step of generating the calibration table,
Acquiring gaze information of a user gazing at a specific object in a three-dimensional virtual space displayed by the display unit; And
Wherein the three-dimensional display device further comprises calculating position information corresponding to the gaze information of the user gazing at the object with reference to the calibration table.
Wherein the step of calculating the position information comprises, when no gaze information included in the calibration table coincides with the gaze information of the user gazing at the object, calculating, based on linearity, the position information corresponding to the gaze information of the user gazing at the object from the gaze information included in the calibration table.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150134489A KR101733519B1 (en) | 2015-09-23 | 2015-09-23 | Apparatus and method for 3-dimensional display |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170037692A KR20170037692A (en) | 2017-04-05 |
KR101733519B1 true KR101733519B1 (en) | 2017-05-08 |
Family
ID=58586851
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150134489A KR101733519B1 (en) | 2015-09-23 | 2015-09-23 | Apparatus and method for 3-dimensional display |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101733519B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20230166292A (en) | 2022-05-30 | 2023-12-07 | 엔에이치엔클라우드 주식회사 | Method and system for user-customized eye tracking calibration |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20220005958A (en) * | 2020-07-07 | 2022-01-14 | 삼성전자주식회사 | Device and method for correcting user’s eyesight and performing calibration |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006108868A (en) * | 2004-10-01 | 2006-04-20 | Canon Inc | Image display apparatus and image display system |
US9030532B2 (en) * | 2004-08-19 | 2015-05-12 | Microsoft Technology Licensing, Llc | Stereoscopic image display |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101691564B1 (en) | 2010-06-14 | 2016-12-30 | 주식회사 비즈모델라인 | Method for Providing Augmented Reality by using Tracking Eyesight |
- 2015-09-23: KR application KR1020150134489A filed; patent KR101733519B1 active (IP Right Grant)
Non-Patent Citations (1)
Title |
---|
3D Gaze Estimation and Interaction Technique, Journal of Broadcast Engineering, Vol. 11, No. 4 (2006)*
Also Published As
Publication number | Publication date |
---|---|
KR20170037692A (en) | 2017-04-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5923603B2 (en) | Display device, head mounted display, calibration method, calibration program, and recording medium | |
KR102460047B1 (en) | Head up display with eye tracking device determining user spectacles characteristics | |
US20220130124A1 (en) | Artificial reality system with varifocal display of artificial reality content | |
US11861062B2 (en) | Blink-based calibration of an optical see-through head-mounted display | |
US10623721B2 (en) | Methods and systems for multiple access to a single hardware data stream | |
US20200201038A1 (en) | System with multiple displays and methods of use | |
US11995847B2 (en) | Glasses-free determination of absolute motion | |
US11057606B2 (en) | Method and display system for information display based on positions of human gaze and object | |
US20210090323A1 (en) | Rendering Computer-Generated Reality Text | |
KR20160096392A (en) | Apparatus and Method for Intuitive Interaction | |
KR101733519B1 (en) | Apparatus and method for 3-dimensional display | |
JP6446465B2 (en) | I / O device, I / O program, and I / O method | |
JP2019102828A (en) | Image processing system, image processing method, and image processing program | |
KR101817952B1 (en) | See-through type head mounted display apparatus and method of controlling display depth thereof | |
US9218104B2 (en) | Image processing device, image processing method, and computer program product | |
Kato et al. | 3D Gaze on Stationary and Moving Visual Targets in Mixed Reality Environments | |
WO2024047990A1 (en) | Information processing device | |
JP6206949B2 (en) | Field-of-view restriction image data creation program and field-of-view restriction device using the same | |
WO2023195995A1 (en) | Systems and methods for performing a motor skills neurological test using augmented or virtual reality | |
WO2023244267A1 (en) | Systems and methods for human gait analysis, real-time feedback and rehabilitation using an extended-reality device | |
US20190114838A1 (en) | Augmented reality system and method for providing augmented reality | |
Duchowski et al. | Head-Mounted System Calibration |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AMND | Amendment | ||
E601 | Decision to refuse application | ||
AMND | Amendment | ||
X701 | Decision to grant (after re-examination) |