CN108632600B - Method and device for playing stereoscopic image by reflecting user interaction information - Google Patents

Method and device for playing stereoscopic image by reflecting user interaction information

Info

Publication number
CN108632600B
CN108632600B CN201810156558.7A
Authority
CN
China
Prior art keywords: user, image, sub, viewed, images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810156558.7A
Other languages
Chinese (zh)
Other versions
CN108632600A (en)
Inventor
李光淳
朴英洙
申泓昌
阴淏珉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electronics and Telecommunications Research Institute ETRI
Original Assignee
Electronics and Telecommunications Research Institute ETRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electronics and Telecommunications Research Institute ETRI filed Critical Electronics and Telecommunications Research Institute ETRI
Publication of CN108632600A publication Critical patent/CN108632600A/en
Application granted granted Critical
Publication of CN108632600B publication Critical patent/CN108632600B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]

Abstract

The application provides a method and a device for playing a stereoscopic image by reflecting user interaction information. The stereoscopic image playing method according to an embodiment of the present application may include: a step of outputting at least one image including a plurality of sub-images and forming a focus based on a combination of the plurality of sub-images; confirming user interaction information of a user viewing the at least one image; a step of controlling the plurality of sub-images included in the at least one image based on the user interaction information, thereby correcting the focus; and a step of outputting the at least one image with the focus corrected.

Description

Method and device for playing stereoscopic image by reflecting user interaction information
Technical area
The present invention relates to a method and apparatus for playing a stereoscopic image, and more particularly, to a method and apparatus for controlling a stereoscopic image in response to user interaction.
Background
Human perception of 3D depth is influenced by a variety of visual cues. A representative example is the following. Because the human eyes are horizontally separated, binocular disparity arises between the images of the outside world seen by each eye, and the brain fuses this disparity into a 3D stereoscopic percept. This stereoscopic perception principle based on binocular disparity is applied in popular 3D stereoscopic display devices: the display device outputs the images to be delivered to the user's two eyes, and the user wears a device (e.g., 3D stereoscopic glasses) that separates the left/right images for the corresponding eyes, so that 3D stereoscopic content can be viewed.
Various display devices to which such stereoscopic image technology can be applied are currently being studied; in particular, the development of wearable stereoscopic display devices is actively under way in combination with virtual reality, augmented reality, and mixed reality technologies.
Disclosure of Invention
In order to maximize the realism of a stereoscopic image, the orientation of a scene or object included in the stereoscopic image, or its focus, needs to be adjusted according to the user's action, i.e., the user's movement or motion.
However, adjusting the output of the stereoscopic image in accordance with the user's action requires operations such as reconstructing and reprocessing the image data for every action, so a large amount of data must be processed or computed.
An object of the present application is to provide a method and apparatus capable of effectively composing and playing back an image that reflects user interaction.
Another object of the present application is to provide a method and apparatus for realizing a stereoscopic image that reflects user interaction while minimizing data processing and arithmetic operations.
The problems to be solved by the present application are not limited to those mentioned above, and other problems not mentioned will be clearly understood from the following description by those of ordinary skill in the art to which the present application belongs.
According to an aspect of the present application, a stereoscopic image playing method may be provided. The method may comprise the steps of: a step of outputting at least one image including a plurality of sub-images and forming a focus based on a combination of the plurality of sub-images; confirming user interaction information of a user viewing the at least one image; a step of controlling the plurality of sub-images included in the at least one image based on the user interaction information, thereby correcting the focus; and a step of outputting the at least one image with the focus corrected.
According to another aspect of the present application, a stereoscopic image playback device may be provided. The apparatus may include: a stereoscopic image display section for outputting a stereoscopic image having at least one image forming a focus based on a combination of a plurality of sub-images; a user interaction information confirming unit for confirming user interaction information of a user viewing the stereoscopic image; and a focus control section that controls the plurality of sub-images based on the user interaction information, thereby correcting the focus.
The features summarized above are merely exemplary aspects of the detailed description of the application, which is described later, and are not intended to limit the scope of the application.
According to the present application, a method and apparatus capable of effectively composing and playing back an image reflecting user interaction can be provided.
Further, according to the present application, it is possible to provide a method and apparatus for realizing a stereoscopic image reflecting user interaction while minimizing data processing or arithmetic processing.
The effects that can be achieved by the present application are not limited to the above-mentioned effects, and other effects that are not mentioned can be clearly understood by those skilled in the art to which the present application pertains from the following description.
Drawings
Fig. 1 is a block diagram showing a configuration of a stereoscopic image playback device according to an embodiment of the present application.
Fig. 2 is a diagram illustrating an example of a stereoscopic image output method used by a display unit included in a stereoscopic image playback device according to an embodiment of the present application.
Fig. 3 is a diagram illustrating image information constituting a stereoscopic image output from fig. 2.
Fig. 4 is a diagram illustrating a stereoscopic image combined based on a plurality of sub-images of fig. 3.
Fig. 5 is a diagram illustrating an operation of a focus control unit included in the stereoscopic image playback device according to the embodiment of the present application.
Fig. 6 is a diagram illustrating a stereoscopic image output by the stereoscopic image playback device according to an embodiment of the present application.
Fig. 7 is a flowchart illustrating a sequence of a stereoscopic image playback method according to an embodiment of the present application.
Detailed Description
Hereinafter, embodiments of the present application will be described in detail with reference to the accompanying drawings so that those skilled in the art to which the present application pertains can easily carry out the present invention. However, the present application can be realized in various forms and is not limited to the embodiments described herein.
When it is determined that the detailed description of the known configurations and functions may obscure the gist of the present application in the description of the embodiments of the present application, the detailed description thereof will be omitted. In the drawings, portions that are not related to the description of the present application are omitted, and similar portions are given similar reference numerals.
In the present application, when a certain component is referred to as being "connected", "coupled", or "joined" to another component, this includes not only a direct connection but also an indirect connection in which another component is present therebetween. In the case where a certain component is described as "including" or "having" another component, it means that the other component may be included without excluding the other component unless otherwise stated.
In the present application, the terms first, second, and the like are used only for the purpose of distinguishing one constituent element from another constituent element, and when not particularly mentioned, the order, importance, and the like of the constituent elements are not limited. Therefore, within the scope of the present application, a first component in one embodiment may be referred to as a second component in another embodiment, and similarly, a second component in one embodiment may be referred to as a first component in another embodiment.
In the present application, the components that are distinguished from each other are only for the purpose of clearly explaining the respective features, and do not mean that the components must be separated from each other. That is, a plurality of components may be combined and realized by one unit of hardware or software, or one component may be dispersed and realized by a plurality of units of hardware or software. Therefore, these combined or dispersed embodiments are included in the scope of the present application even if not otherwise mentioned.
In the present application, the constituent elements described in the various embodiments do not mean that they are essential constituent elements, and some of them may be optional constituent elements. Therefore, an embodiment including a partial set of the constituent elements described in one embodiment is also included in the scope of the present application. Further, an embodiment in which other components are added to the components described in the various embodiments is also included in the scope of the present application.
Technical terms used in the embodiments of the present application are defined as follows.
The user interaction information is information used to control the focus of a stereoscopic image, and includes information indicating the direction in which the user's gaze is directed and the area viewed by the user's gaze.
The stereoscopic image includes a plurality of sub-images that take into account disparity and parallax in the horizontal or vertical direction. Accordingly, the plurality of sub-images have mutually different horizontal or vertical disparities and parallaxes.
Hereinafter, embodiments of the present application will be described with reference to the drawings.
Fig. 1 is a block diagram showing a configuration of a stereoscopic image playback device according to an embodiment of the present application.
Referring to fig. 1, the stereoscopic image playback device according to an embodiment of the present invention may include a display unit 11, a user interaction information confirmation unit 13, and a focus control unit 15.
The display section 11 may include a display panel for outputting the input stereoscopic image data. In particular, in order to display a three-dimensional stereoscopic image, the display unit 11 may use left and right display panels, a parallax barrier for separating parallax, lenticular lenses, micro lenses, a line light source, and the like.
Fig. 2 is a diagram illustrating a stereoscopic image output method used by a display unit included in a stereoscopic image playback device according to an embodiment of the present application, and fig. 3 is a diagram illustrating image information constituting the stereoscopic image output in fig. 2.
Referring to fig. 2, the display portion 11 may include a left-eye display panel 110L and a right-eye display panel 110R corresponding to the left eye and the right eye, respectively, and may further include a left-eye lens 112L disposed between the left eye and the left-eye display panel 110L and a right-eye lens 112R disposed between the right eye and the right-eye display panel 110R.
According to this structure, the left image 200L output from the left-eye display panel 110L can be provided to the left eye of the user via the left-eye lens 112L, and the right image 200R output from the right-eye display panel 110R can be provided to the right eye of the user via the right-eye lens 112R. Here, the focal length 120L of the left-eye lens 112L may be set to be greater than the distance 130L between the left-eye display panel 110L and the left-eye lens 112L, and the focal length 120R of the right-eye lens 112R may be set to be greater than the distance 130R between the right-eye display panel 110R and the right-eye lens 112R.
According to this structure, the user can recognize that the left image 200L and the right image 200R are formed on the virtual image plane (virtual image plane) 150.
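For illustration only, the location of this virtual image plane can be checked with the thin-lens equation; the relation below assumes an ideal paraxial lens and is not notation taken from the specification (f is the lens focal length, d the panel-to-lens distance).

```latex
% With the display panel placed inside the focal length (d < f), the image distance v
% becomes negative, i.e. a magnified virtual image forms behind the panel:
\[
\frac{1}{f} = \frac{1}{d} + \frac{1}{v}
\;\Longrightarrow\;
|v| = \frac{d\,f}{f - d}, \qquad m = \frac{|v|}{d} = \frac{f}{f - d} > 1 \quad (d < f).
\]
```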
Further, the left image 200L and the right image 200R may contain at least one object, and the at least one object may form a predetermined disparity between the left image 200L and the right image 200R. In this way, the at least one object is perceived as being located in front of or behind the virtual image plane 150, depending on its disparity between the left image 200L and the right image 200R.
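The depth at which such an object is perceived relative to the virtual image plane follows from simple ray geometry; the formula below is a standard stereoscopy relation added here for illustration (e is the interpupillary distance, D the distance to the virtual image plane, p the crossed disparity measured on that plane), not a formula from the patent.

```latex
% Perceived distance z of a point rendered with disparity p on a plane at distance D:
\[
z = \frac{e\,D}{e + p}, \qquad z < D \ \text{for crossed disparity } (p > 0), \qquad z > D \ \text{for uncrossed disparity } (p < 0).
\]
```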
The aforementioned manner of realizing a stereoscopic image based on the disparity between the left image 200L and the right image 200R provides a stereoscopic image with reference to a single viewpoint (view). Further, in order to realize a Multi-View image, each of the left-eye display panel 110L and the right-eye display panel 110R may include at least one parallax barrier, and a multi-view stereoscopic image may be realized by the parallax barrier.
In the foregoing embodiment, the disparity and motion parallax between the left image 200L and the right image 200R exemplify parallax in the horizontal direction only; however, to express the full parallax of a light field, an image that can also reflect parallax in the vertical direction is required.
Accordingly, the left image 200L and the right image 200R may have sub-images that can provide parallax in the horizontal and vertical directions, respectively.
As can be seen from fig. 3, which illustrates the image information constituting a stereoscopic image, the stereoscopic image may have a plurality of sub-images 301, 302, 303, and 304. The plurality of sub-images are organized according to the horizontal and vertical parallax that make up the image, so the number of sub-images in the stereoscopic image may correspond to the product of the number of horizontal viewpoints and the number of vertical viewpoints.
For example, the stereoscopic image may include an Epipolar Plane Image (EPI).
In fig. 3, the X-axis and the Y-axis represent the horizontal and vertical resolutions, respectively. For example, when the View axis indicates horizontal viewpoints, X represents the horizontal resolution and Y the vertical resolution; when the View axis indicates vertical viewpoints, X represents the vertical resolution and Y the horizontal resolution.
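One convenient way to picture this organization is a four-dimensional array indexed by (vertical viewpoint, horizontal viewpoint, row, column); the Python sketch below uses hypothetical dimensions and is only an illustration of the layout, not a format defined by the patent.

```python
import numpy as np

# Hypothetical light-field layout: V x H viewpoints, each an h x w sub-image.
V, H = 4, 4          # number of vertical / horizontal viewpoints
h, w = 240, 320      # per-sub-image resolution
light_field = np.zeros((V, H, h, w), dtype=np.float32)

# The stereoscopic image then carries V * H sub-images
# (the product of the horizontal and vertical viewpoint counts).
num_sub_images = V * H

# Fixing one pixel row and sweeping the horizontal View axis yields an
# epipolar-plane-image (EPI) slice: View axis versus horizontal resolution.
row = 120
epi_slice = light_field[0, :, row, :]    # shape (H, w)
print(num_sub_images, epi_slice.shape)
```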
In addition, at least one object may be included in the stereoscopic image, and the at least one object may represent mutually different parallaxes D1, D2, and D3 among the plurality of sub-images 301, 302, 303, and 304.
Fig. 4 is a diagram illustrating a stereoscopic image 400 combined based on a plurality of sub-images 301, 302, 303, 304 of fig. 3.
The plurality of sub-images 301, 302, 303, 304 may include a first object 410 and a second object 420; the first object 410 may be disposed at the same position in the plurality of sub-images 301, 302, 303, 304 without parallax, while the second object 420 may carry predetermined parallaxes D1, D2, D3 across the plurality of sub-images 301, 302, 303, 304. Thus, in the stereoscopic image 400, the area where the first object 410 exists may appear as the in-focus main viewpoint, and the area where the second object 420 exists may appear as an out-of-focus area.
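The link between per-object parallax and perceived focus can be illustrated with a shift-and-average refocusing sketch in the spirit of light-field rendering; the function below assumes a uniform set of sub-images and integer pixel shifts, and is not the patent's own algorithm.

```python
import numpy as np

def refocus(sub_images, shifts):
    """Average sub-images after shifting each by its (dy, dx) offset.

    Objects whose parallax is cancelled by the shifts add up coherently and
    appear sharp (like the first object 410); objects with residual parallax
    are averaged into a blur (like the second object 420).
    """
    acc = np.zeros_like(sub_images[0], dtype=np.float64)
    for img, (dy, dx) in zip(sub_images, shifts):
        acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / len(sub_images)

# Example: four sub-images in which the second object drifts by 0, 1, 2, 3 pixels.
subs = [np.random.rand(64, 64) for _ in range(4)]
sharp_first  = refocus(subs, [(0, 0)] * 4)                  # zero-parallax object stays in focus
sharp_second = refocus(subs, [(0, -d) for d in range(4)])   # shifts cancel the second object's parallax
```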
In addition, the user interaction information confirming section 13 may be used to confirm the user interaction information.
In an embodiment of the present application, the user interaction information is information used to control the focus of a stereoscopic image, and may include information indicating the direction in which the user's gaze is directed and the area viewed by the user's gaze.
The user interaction information may contain information indicating a head position of the user and a head direction of the user.
And, the user interaction information may include information indicating a pupil position, a pupil size, and a pupil direction of the user.
The display part 11 may be a Head Mounted Display (HMD) serving as a personal terminal, and the user interaction information confirmation part 13 may include: a motion sensor for detecting movement in the 3-axis directions; and a movement confirmation unit configured to confirm, from the sensor data detected by the motion sensor, information indicating the direction in which the user's gaze is directed and the area viewed by the user's gaze.
As another example, the display unit 11 may be a display device of a predetermined size. In this case, the user interaction information confirmation part 13 may include: a camera for photographing the user looking at the display part 11; and a motion confirmation unit that detects a specific area (for example, the head area) of the user in the image captured by the camera and, by tracking the movement of that area, confirms the direction in which the user's gaze is directed and the area viewed by the user's gaze. The motion confirmation unit may also detect the region where the user's eyes (or pupils) are located, and detect the size and direction of the pupils.
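As a purely illustrative sketch of how such a motion confirmation unit might map a detected pupil position to the area being viewed, the following uses a simple linear calibration; every function name, parameter, and calibration value is an assumption, not an interface described in the patent.

```python
def pupil_to_screen(pupil_xy, screen_w, screen_h, calib=((0.2, 0.8), (0.3, 0.7))):
    """Map a normalized pupil position inside the detected eye box to a screen point.

    calib holds the pupil ranges observed while the user looked at the screen
    corners (a calibration assumption for this sketch).
    """
    (x0, x1), (y0, y1) = calib
    u = min(max((pupil_xy[0] - x0) / (x1 - x0), 0.0), 1.0)
    v = min(max((pupil_xy[1] - y0) / (y1 - y0), 0.0), 1.0)
    return u * screen_w, v * screen_h

def gaze_region(gaze_xy, region_w=256, region_h=256):
    """Return the rectangle (x, y, w, h) centred on the estimated gaze point."""
    gx, gy = gaze_xy
    return gx - region_w / 2, gy - region_h / 2, region_w, region_h

print(gaze_region(pupil_to_screen((0.55, 0.50), 1920, 1080)))
```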
The focus control part 15 detects a sub-image of a viewpoint corresponding to the user interaction information confirmed by the user interaction information confirming part 13. The focus control unit 15 controls the output of the sub-image to control the focus of the stereoscopic image output by the display unit 11.
For example, the focus control section 15 may detect the sub-images of the viewpoint corresponding to the gaze direction and gaze area confirmed by the user interaction information confirmation section 13. The sub-images may include at least one sub-image to be output as the left image and at least one sub-image to be output as the right image. By controlling the output positions of the at least one sub-image included in the left image and the at least one sub-image included in the right image, the focus on at least one object included in the stereoscopic image can be adjusted. In doing so, the focus controller 15 may adjust the focus on the at least one object by controlling the main viewpoint of the sub-images included in the left and right images, or by controlling the parallax between the sub-image included in the left image and the sub-image included in the right image.
Fig. 5 is a diagram illustrating an operation of a focus control unit included in the stereoscopic image playback device according to the embodiment of the present application.
From the plurality of sub-images 510L-1, 510L-2, 510L-3, ..., 510L-n (n being a positive integer) included in the left image and the plurality of sub-images 510R-1, 510R-2, 510R-3, ..., 510R-n included in the right image, the focus control unit 15 can detect the sub-images of the viewpoint corresponding to the gaze direction and gaze area confirmed by the user interaction information confirmation unit 13, for example the sub-images 510L-1, 510L-2, and 510L-3 of the left image and the sub-images 510R-1, 510R-2, and 510R-3 of the right image. The focus controller 15 may then shift each line so that the regions corresponding to the main viewpoint in the sub-images 510L-1, 510L-2, and 510L-3 of the left image and the sub-images 510R-1, 510R-2, and 510R-3 of the right image are aligned in the vertical direction.
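A minimal sketch of this line-shift alignment, assuming each sub-image's horizontal offset from the main viewpoint is known as an integer pixel disparity (array names and shapes are illustrative only):

```python
import numpy as np

def align_to_main_viewpoint(sub_images, disparities):
    """Shift each sub-image so the gazed region (main viewpoint) occupies the
    same columns in every sub-image, i.e. the regions line up vertically.

    sub_images:  2-D arrays, e.g. 510L-1..510L-3 or 510R-1..510R-3.
    disparities: per-sub-image horizontal offset of the gazed object, in pixels.
    """
    return [np.roll(img, -d, axis=1) for img, d in zip(sub_images, disparities)]

# Example with dummy data: three left sub-images whose gazed object drifts by 0, 1, 2 pixels.
left_subs = [np.random.rand(32, 32) for _ in range(3)]
left_aligned = align_to_main_viewpoint(left_subs, [0, 1, 2])
```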
The focus controller 15 may output the sub-images 510L-1, 510L-2, and 510L-3 of the left image and the plurality of sub-images (510R-1, 510R-2, and 510R-3) of the right image, the main viewpoints of which are moved and aligned.
In this way, when the sub-images 510L-1, 510L-2, 510L-3 of the left image and the plurality of sub-images 510R-1, 510R-2, 510R-3 of the right image, of which the main view points are moved and aligned, are output, the focus of the object included in the stereoscopic image can be adjusted (refer to FIG. 6).
Further, the focus controller 15 may generate at least one sub-image 510L-1', 510L-2', 510L-3', 510R-1', 510R-2', 510R-3' corresponding to the virtual viewpoint by an interpolation operation using the sub-images 510L-1, 510L-2, 510L-3 of the left image and the plurality of sub-images 510R-1, 510R-2, 510R-3 of the right image, the main viewpoint of which is moved and aligned. Also, the focus control section 15 may filter-process at least one sub-image for the virtual viewpoint.
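A rough illustration of generating a virtual-viewpoint sub-image (such as 510L-1') by interpolating between two aligned neighbours and then filtering it; the linear blend and box filter below merely stand in for whatever interpolation and filtering an actual implementation would use.

```python
import numpy as np

def interpolate_viewpoint(sub_a, sub_b, t=0.5):
    """Blend two aligned neighbouring sub-images into a virtual in-between viewpoint."""
    return (1.0 - t) * sub_a + t * sub_b

def box_filter(img, k=3):
    """Separable box filter used here as a stand-in for the filtering step."""
    kernel = np.ones(k) / k
    tmp = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, tmp)

sub_a, sub_b = np.random.rand(32, 32), np.random.rand(32, 32)
virtual_sub = box_filter(interpolate_viewpoint(sub_a, sub_b))
```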
The focus control unit 15 may also reconstruct the sub-images by cropping (cutting) or combining (adding) the sub-images 510L-1, 510L-2, 510L-3 of the left image and the sub-images 510R-1, 510R-2, 510R-3 of the right image whose main viewpoints have been shifted into alignment, reflecting the direction and the area viewed by the user's gaze.
Further, the focus control section 15 may also reconstruct the sub-image by cutting (cropping) or combining (adding) the sub-image generated through interpolation and filtering.
Fig. 7 is a flowchart illustrating a sequence of a stereoscopic image playback method according to an embodiment of the present application.
The stereoscopic image playing method according to an embodiment of the present application can be implemented by the aforementioned stereoscopic image playing apparatus. To display a three-dimensional stereoscopic image, the stereoscopic image playback device may use left and right display panels together with elements that separate the parallax, such as a parallax barrier, lenticular lenses, micro lenses, a line light source, and the like.
Referring to fig. 7, the stereoscopic image playback device may confirm and output stereoscopic image data (S701). The stereo image may be provided with at least one sub-image.
The stereoscopic image may be provided with sub-images required to provide parallax in horizontal and vertical directions. Still further, the stereoscopic image may include a left image provided to a left eye of a user and a right image provided to a right eye of the user, and the left image and the right image may include sub-images, respectively.
In such an environment, the stereoscopic image playback apparatus can confirm the sub-images required for outputting the stereoscopic image. The plurality of sub-images are organized according to the horizontal and vertical parallax constituting the image, and thus the number of sub-images included in the stereoscopic image may correspond to the product of the number of horizontal viewpoints and the number of vertical viewpoints.
For example, the stereoscopic image is as illustrated in fig. 3 and 4. At least one object may be provided in the stereoscopic image, and the at least one object may exhibit mutually different parallaxes D1, D2, D3 among the plurality of sub-images 301, 302, 303, 304. The focus of the object in the stereoscopic image 400 can be controlled according to the parallax of the sub-images 301, 302, 303, and 304. For example, the plurality of sub-images 301, 302, 303, 304 may include a first object 410 and a second object 420; the first object 410 may be disposed at the same position in the plurality of sub-images 301, 302, 303, 304 without parallax, while the second object 420 may carry predetermined parallaxes D1, D2, D3 across the plurality of sub-images 301, 302, 303, 304. Thus, in the stereoscopic image 400, the area where the first object 410 exists may be presented as the in-focus main viewpoint, and the area where the second object 420 exists may be presented as an out-of-focus area.
Further, in step S702, the stereoscopic image playback device may confirm the user interaction information.
In an embodiment of the present application, the user interaction information is information used to control the focus of a stereoscopic image, and may include information indicating the direction in which the user's gaze is directed and the area viewed by the user's gaze.
The user interaction information may contain information indicating a head position of the user and a head direction of the user.
And, the user interaction information may include information indicating a pupil position, a pupil size, and a pupil direction of the user.
The stereoscopic image playback device may be provided in the form of a Head Mounted Display (HMD) serving as a personal terminal, and may include a motion sensor for detecting movement in the 3-axis directions. The stereoscopic image playback device can thus confirm, from the sensor data detected by the motion sensor, information indicating the direction in which the user's gaze is directed and the area viewed by the user's gaze.
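Purely as an illustration, head orientation reported by such a motion sensor could be mapped to a gaze direction vector and a point on the displayed image as sketched below; the field-of-view value and the pinhole-style mapping are assumptions, not values from the specification.

```python
import math

def gaze_direction(yaw_deg, pitch_deg):
    """Unit gaze vector (x right, y up, z forward) from head yaw/pitch."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.sin(yaw) * math.cos(pitch),
            math.sin(pitch),
            math.cos(yaw) * math.cos(pitch))

def viewed_point(yaw_deg, pitch_deg, img_w, img_h, fov_deg=90.0):
    """Map yaw/pitch to a pixel on the displayed image, assuming a pinhole-like projection."""
    half = math.tan(math.radians(fov_deg) / 2)
    u = 0.5 + math.tan(math.radians(yaw_deg)) / (2 * half)
    v = 0.5 - math.tan(math.radians(pitch_deg)) / (2 * half)
    return u * img_w, v * img_h

print(gaze_direction(10, -5), viewed_point(10, -5, 1920, 1080))
```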
As another example, the stereoscopic image playback device may include a display device of a predetermined size. In this case, the stereoscopic image playing device may include a camera for photographing the user, detect a specific region (for example, the head region) of the user in the image captured by the camera, and, by tracking the movement of that region, confirm information indicating the direction in which the user's gaze is directed and the area viewed by the user's gaze. Furthermore, the stereoscopic image playing device may detect the region where the user's eyes (or pupils) are located in the captured image, and detect the size and direction of the pupils.
Next, the stereoscopic image playback device detects a sub-image of a viewpoint corresponding to the user interaction information (S703), and reconstructs the sub-image according to the user interaction information (S704).
Specifically, the stereoscopic image playback device may detect the sub-images of the viewpoint corresponding to the confirmed gaze direction and gaze area. The sub-images may include at least one sub-image to be output as the left image and at least one sub-image to be output as the right image. Further, by controlling the output position, output size, arrangement, and the like of the at least one sub-image included in the left image and the at least one sub-image included in the right image as described above, the sub-images can be reconstructed.
In this manner, by reconstructing the sub-image in step S704, the focus on at least one object included in the stereoscopic image can be adjusted.
Referring to fig. 5, the operations of steps S703 and S704 are illustrated in more detail.
From the plurality of sub-images 510L-1, 510L-2, 510L-3, ..., 510L-n of the left image and the plurality of sub-images 510R-1, 510R-2, 510R-3, ..., 510R-n of the right image included in the stereoscopic image, the stereoscopic image playing apparatus can detect the sub-images of the viewpoint corresponding to the user interaction information, for example the sub-images 510L-1, 510L-2, 510L-3 of the left image and the sub-images 510R-1, 510R-2, 510R-3 of the right image.
The stereoscopic image playback apparatus may reconstruct the sub-images by shifting the respective lines so that the regions corresponding to the main viewpoint among the sub-images 510L-1, 510L-2, 510L-3 of the left image and the sub-images 510R-1, 510R-2, 510R-3 of the right image are aligned in the vertical direction.
Also, the stereoscopic image playing device may generate at least one sub-image for a virtual viewpoint by an interpolation operation using the sub-images 510L-1, 510L-2, 510L-3 of the left image and the sub-images 510R-1, 510R-2, 510R-3 of the right image whose main viewpoints have been shifted into alignment, and may reconstruct the sub-images by filtering the generated sub-images.
Also, reflecting the direction and the area viewed by the user's gaze, the stereoscopic image playing device may reconstruct the sub-images by cropping (cutting) or combining (adding) the sub-images 510L-1, 510L-2, 510L-3, 510R-1, 510R-2, 510R-3 whose main viewpoints have been shifted into alignment, or by cropping or combining the at least one virtual-viewpoint sub-image generated through interpolation and filtering.
In step S705, a stereoscopic image including the sub-image reconstructed by the foregoing operation may be output.
In this way, by reconstructing the sub-images to reflect the user interaction information and by composing and outputting a stereoscopic image including the reconstructed sub-images, the focus of an object included in the stereoscopic image, or the output area of the image, can be adjusted.
According to the present application, in an apparatus for displaying a stereoscopic image, particularly a personal immersive display apparatus such as an HMD, a stereoscopic image that realizes continuous parallax and a variable focus can be provided with a small amount of arithmetic processing, enabling a highly immersive virtual reality service that causes little viewing fatigue.
The exemplary methods of the present application are described as a series of operations for clarity of description, but this is not intended to limit the order in which the steps are performed; the steps may be performed simultaneously or in a different order as needed. In order to implement the method of the present application, other steps may be added to the illustrated steps, some steps may be removed while the remaining steps are performed, or some steps may be removed and other steps added.
The various embodiments of the present application do not enumerate every possible combination but are intended to illustrate representative aspects of the present application, and the items described in the various embodiments may be applied independently or in combinations of two or more.
Also, various embodiments of the present application may be implemented in hardware, firmware, software, or a combination thereof. When implemented in hardware, an embodiment may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, and the like.
The scope of the present application includes: software or machine executable commands (e.g., operating systems, application programs, firmware, programs, etc.) that enable operations of methods according to various embodiments to be executed on a device or computer, as well as non-transitory computer readable media having such software or commands, etc., stored thereon for execution on a device or computer.

Claims (12)

1. A stereoscopic image playing method includes:
a step of outputting at least one image including a plurality of sub-images and forming a focus based on a combination of the plurality of sub-images;
confirming user interaction information of a user viewing the at least one image;
a step of controlling the plurality of sub-images included in the at least one image based on the user interaction information, thereby correcting the focus; and
a step of outputting the at least one image with the focus corrected,
the plurality of sub-images includes a plurality of left sub-images that are present in the left image and a plurality of right sub-images that are present in the right image,
the step of confirming the user interaction information comprises:
a step of confirming a direction in which the user's sight line is viewed and a range of an area in which the user's sight line is viewed,
the step of correcting the focus comprises:
a step of determining a main viewpoint based on a direction in which the sight line of the user is viewed and a range of an area in which the sight line of the user is viewed;
a step of moving lines of the plurality of left sub-images or the plurality of right sub-images to be aligned with the main viewpoint;
cutting or combining the plurality of left sub-images or the plurality of right sub-images after the lines are moved; and
a step of composing the at least one image comprising the plurality of left sub-images or the plurality of right sub-images cut or combined.
2. The stereoscopic image playback method as claimed in claim 1,
confirming the user interaction information, comprising:
confirming information indicating a head position of the user and a head direction of the user; and
a step of confirming a direction in which the line of sight of the user is viewed and a range of an area in which the line of sight of the user is viewed, corresponding to information indicating the head position of the user and the head direction of the user.
3. The stereoscopic image playback method as claimed in claim 1,
confirming the user interaction information, comprising:
confirming information indicating the pupil position, the pupil size, and the pupil direction of the user; and
a step of confirming a direction in which the line of sight of the user is viewed and a range of an area in which the line of sight of the user is viewed, which correspond to information indicating a pupil position, a pupil size, and a pupil direction of the user.
4. The stereoscopic image playback method as claimed in claim 1,
the step of correcting the focus information of the at least one image comprises:
confirming a relationship between the line of sight of the user and an object of the sub-image;
a step of detecting the sub-image corresponding to a direction in which the user's sight line is viewed and a range of an area in which the user's sight line is viewed;
a step of adjusting the sub-image to match a direction in which the user's sight line is viewed and an area in which the user's sight line is viewed; and
a step of composing the at least one image with the adjusted sub-image.
5. The stereoscopic image playback method as claimed in claim 1,
the step of correcting focus information of the at least one image comprises:
a step of detecting the subimage corresponding to the direction in which the user's sight line is viewed and the range of the area in which the user's sight line is viewed, and cutting a part of the area included in the detected subimage to match the direction in which the user's sight line is viewed and the area in which the user's sight line is viewed; and
a step of composing said at least one image having a partial area contained in said sub-image.
6. The stereoscopic image playback method as claimed in claim 1,
the at least one image includes at least one left image and a right image.
7. A stereoscopic image playback apparatus comprising:
a stereoscopic image display section for outputting a stereoscopic image having at least one image forming a focus based on a combination of a plurality of sub-images;
a user interaction information confirming unit for confirming user interaction information of a user viewing the stereoscopic image; and
a focus control section that controls the plurality of sub-images based on the user interaction information so as to correct the focus,
the plurality of sub-images includes a plurality of left sub-images that are present in the left image and a plurality of right sub-images that are present in the right image,
the user interaction information confirmation part confirms a direction in which the line of sight of the user is viewed and a range of an area in which the line of sight of the user is viewed,
the focus control unit determines a main viewpoint based on a direction in which the user's sight line is viewed and a range of an area in which the user's sight line is viewed,
the focus control part moves lines of the plurality of left sub-images or the plurality of right sub-images to be aligned with the main viewpoint;
the focus control part cuts or combines the left sub-images or the right sub-images after the lines are moved,
the focus control section composes the at least one image including the plurality of left sub-images or the plurality of right sub-images cut or combined.
8. The stereoscopic image playback apparatus as recited in claim 7,
the user interaction information confirmation section may confirm the user interaction information,
confirming information indicating a head position of the user and a head direction of the user,
confirming a direction in which the line of sight of the user is viewed and an area in which the line of sight of the user is viewed, corresponding to information indicating the head position of the user and the head direction of the user.
9. The stereoscopic image playback apparatus as recited in claim 7,
the user interaction information confirmation section may confirm the user interaction information,
confirming information indicating a pupil position, a pupil size, and a pupil direction of the user,
confirming a direction in which the user's sight line is viewed and a range of an area in which the user's sight line is viewed, corresponding to information indicating the user's pupil position, pupil size, and pupil direction.
10. The stereoscopic image playback apparatus as recited in claim 7,
the focus control section is configured to control the focus of the image,
confirming a relationship between the line of sight of the user and the object of the sub-image,
detecting the sub-images corresponding to a direction in which the user's sight line is viewed and a range of an area in which the user's sight line is viewed,
adjusting the sub-images to match a direction in which the user's gaze is viewed and an area in which the user's gaze is viewed,
forming the at least one image with the adjusted sub-images.
11. The stereoscopic image playback apparatus as recited in claim 7,
the focus control section is configured to control the focus of the image,
detecting the subimage corresponding to the direction in which the user's sight line is viewed and the range of the area in which the user's sight line is viewed, cutting a part of the area included in the detected subimage to match the direction in which the user's sight line is viewed and the area in which the user's sight line is viewed,
forming the at least one image having a partial area included in the sub-image.
12. The stereoscopic image playback apparatus as recited in claim 7,
the at least one image includes at least one left image and a right image.
CN201810156558.7A 2017-03-24 2018-02-24 Method and device for playing stereoscopic image by reflecting user interaction information Expired - Fee Related CN108632600B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0037894 2017-03-24
KR1020170037894A KR102306775B1 (en) 2017-03-24 2017-03-24 Method and apparatus for displaying a 3-dimensional image adapting user interaction information

Publications (2)

Publication Number Publication Date
CN108632600A CN108632600A (en) 2018-10-09
CN108632600B true CN108632600B (en) 2021-05-18

Family

ID=63706065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810156558.7A Expired - Fee Related CN108632600B (en) 2017-03-24 2018-02-24 Method and device for playing stereoscopic image by reflecting user interaction information

Country Status (2)

Country Link
KR (1) KR102306775B1 (en)
CN (1) CN108632600B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102334503B1 (en) * 2019-04-25 2021-12-13 주식회사 와일드비전 Apparatus and method for reproducing multi-focused image

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2395764B1 (en) * 2010-06-14 2016-02-17 Nintendo Co., Ltd. Storage medium having stored therein stereoscopic image display program, stereoscopic image display device, stereoscopic image display system, and stereoscopic image display method
TWI496452B (en) * 2011-07-29 2015-08-11 Wistron Corp Stereoscopic image system, stereoscopic image generating method, stereoscopic image adjusting apparatus and method thereof
US9426447B2 (en) * 2012-10-09 2016-08-23 Electronics And Telecommunications Research Institute Apparatus and method for eye tracking
US9137524B2 (en) * 2012-11-27 2015-09-15 Qualcomm Incorporated System and method for generating 3-D plenoptic video images
WO2016115870A1 (en) * 2015-01-21 2016-07-28 成都理想境界科技有限公司 Binocular ar head-mounted display device and information displaying method therefor
US10136124B2 (en) * 2015-04-27 2018-11-20 Microsoft Technology Licensing, Llc Stereoscopic display of objects

Also Published As

Publication number Publication date
CN108632600A (en) 2018-10-09
KR20180108314A (en) 2018-10-04
KR102306775B1 (en) 2021-09-29

Similar Documents

Publication Publication Date Title
US7440004B2 (en) 3-D imaging arrangements
EP1967016B1 (en) 3d image display method and apparatus
JP5299111B2 (en) Image processing apparatus, image processing method, and program
EP1836859A1 (en) Automatic conversion from monoscopic video to stereoscopic video
KR20120030005A (en) Image processing device and method, and stereoscopic image display device
CA3086592A1 (en) Viewer-adjusted stereoscopic image display
US20230276041A1 (en) Combining vr or ar with autostereoscopic usage in the same display device
KR101821141B1 (en) 3d imaging system and imaging display method for the same
KR100751290B1 (en) Image system for head mounted display
CN108632600B (en) Method and device for playing stereoscopic image by reflecting user interaction information
KR20170043791A (en) Image format and apparatus for 360 virtual reality services
TW201733351A (en) Three-dimensional auto-focusing method and the system thereof
CN108702499A (en) The stereopsis display device of bidimensional image
CA2861212A1 (en) Image processing device and method, and program
CN101908233A (en) Method and system for producing plural viewpoint picture for three-dimensional image reconstruction
EP1875306A2 (en) Depth illusion digital imaging
US20060152580A1 (en) Auto-stereoscopic volumetric imaging system and method
WO2017085803A1 (en) Video display device and video display method
KR102242923B1 (en) Alignment device for stereoscopic camera and method thereof
KR20160003355A (en) Method and system for processing 3-dimensional image
Watt et al. 3D media and the human visual system
CN104432922A (en) Helmet suitable for virtual reality
CN111684517B (en) Viewer adjusted stereoscopic image display
US20160103330A1 (en) System and method for adjusting parallax in three-dimensional stereoscopic image representation
Rhee et al. Stereoscopic view synthesis by view morphing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee (granted publication date: 20210518)