CN115220240B - Method for generating stereoscopic image data adapting to eye positions and display system - Google Patents

Method for generating stereoscopic image data adapting to eye positions and display system

Info

Publication number
CN115220240B
Authority
CN
China
Prior art keywords
stereoscopic image
image data
unit
stereoscopic
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110415722.3A
Other languages
Chinese (zh)
Other versions
CN115220240A (en)
Inventor
杨钧翔
丁志宏
张凯杰
侯昕佑
施智维
陈韦安
陈冠宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mirage Start Co ltd
Original Assignee
Mirage Start Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mirage Start Co ltd
Priority to CN202110415722.3A
Publication of CN115220240A
Application granted
Publication of CN115220240B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/27Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The application provides a method for generating stereoscopic image data adapted to eye positions and a display system, which together generate stereoscopic images displayed in a stereoscopic space.

Description

Method for generating stereoscopic image data adapting to eye positions and display system
Technical Field
The present application relates to stereoscopic image display technology, and more particularly to a method and display system for generating stereoscopic image data adapted to the eye position, in which the stereoscopic image data can be adjusted according to the eye position.
Background
Conventional methods for displaying stereoscopic images mostly exploit the parallax produced when the two eyes view the same object: the two eyes are shown two slightly different still images, or two different image sequences are played back in turn. Such content is usually viewed through special glasses, such as red-blue (anaglyph) glasses, polarized glasses or shutter glasses, so that the binocular parallax is fused in the brain into a stereoscopic visual effect with depth.
Another approach provides an autostereoscopic display device: by means of optical components in the device, and without special glasses, an observer receives images with parallax differences at specific viewing angles, which likewise produces a stereoscopic visual effect with depth.
The prior art, however, generates the stereoscopic image without reference to the user's viewing position, or even requires the user to stay at a particular viewing position, and offers no solution for adapting the image to that position.
Disclosure of Invention
To improve on prior-art stereoscopic image display methods that do not take eye positions into account, the application discloses a method for generating stereoscopic image data adapted to eye positions and a corresponding display system.
According to an embodiment, the display system includes a multi-optical component module, composed of a plurality of lens units, for presenting a stereoscopic image; a display panel for displaying an integrated image, the integrated image being composed of a plurality of unit images so that, once displayed, the stereoscopic image is presented through the multi-optical component module; a display driving unit for driving the display panel to display the integrated image; and an image processing unit in which the method for generating stereoscopic image data adapted to eye positions is performed, so that stereoscopic image data describing the three-dimensional spatial information of the stereoscopic image is processed into an integrated image adapted to the eye position.
The integrated image is formed as follows. First, stereoscopic image data describing the stereoscopic image are obtained, the position of the user's eyes is obtained through detection, and the physical characteristics of the display panel and of the multi-optical component module in the display system are obtained. A visible range is then formed from the ray tracing information between the eye position and each lens unit in the multi-optical component module. According to the part of the stereoscopic image covered by the visible range corresponding to each lens unit, and according to the physical characteristics of the display panel and the multi-optical component module, each unit image is computed and mapped onto display pixel positions of the display panel, so that a plurality of unit images corresponding to the plurality of lens units are generated; these unit images together form the integrated image, which records the stereoscopic image data adapted to the eye position.
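As a rough, non-authoritative illustration of this flow (a simplified sketch, not the patented algorithm), the following Python fragment works through a two-dimensional (x-z) toy case: a row of lens units slightly above a display panel at z = 0, a point eye, and a floating image given as a few (x, z, intensity) samples. The pinhole-style projection through each lens centre, the numeric values and all names are assumptions introduced for the sketch; refraction, the finite eye aperture and the restriction of each unit image to the pixel region behind its own lens are ignored.

```python
# A minimal 2-D (x-z) sketch of the flow described above; all names and values are hypothetical.
import numpy as np

PANEL_Z = 0.0          # z position of the display panel plane (mm)
LENS_Z = 2.0           # lens plane sits slightly above the panel (mm)
LENS_PITCH = 5.0       # width of one lens unit (mm)
PIXEL_PITCH = 0.1      # panel pixel size (mm)
NUM_LENSES = 9
PANEL_PIXELS = round(NUM_LENSES * LENS_PITCH / PIXEL_PITCH)   # 450 pixels across the toy panel

# Lens centres along x, centred around x = 0.
lens_centers = (np.arange(NUM_LENSES) - (NUM_LENSES - 1) / 2) * LENS_PITCH

# A toy "floating stereoscopic image": a few samples at (x, z, intensity).
floating_points = [(-3.0, 15.0, 1.0), (0.0, 18.0, 0.8), (4.0, 16.0, 0.6)]

eye = np.array([10.0, 300.0])   # detected eye position (x, z), far above the panel

def in_visible_range(point, lens_x):
    """RoV test: does the ray eye -> point pass through this lens aperture?"""
    px, pz, _ = point
    t = (eye[1] - LENS_Z) / (eye[1] - pz)      # parameter where the eye -> point ray meets the lens plane
    x_at_lens = eye[0] + t * (px - eye[0])
    return abs(x_at_lens - lens_x) <= LENS_PITCH / 2

def project_to_panel(point, lens_x):
    """Project a covered point through the lens centre onto the panel plane."""
    px, pz, value = point
    t = (pz - PANEL_Z) / (pz - LENS_Z)         # extend the point -> lens-centre ray down to the panel
    return px + t * (lens_x - px), value

# Build the integrated image: one shared 1-D strip of panel pixels.
integrated = np.zeros(PANEL_PIXELS)
for lens_x in lens_centers:
    for p in floating_points:
        if in_visible_range(p, lens_x):        # part of the image covered by this lens's RoV
            x_panel, value = project_to_panel(p, lens_x)
            idx = int(round(x_panel / PIXEL_PITCH + PANEL_PIXELS / 2))
            if 0 <= idx < PANEL_PIXELS:
                integrated[idx] = value        # this pixel belongs to this lens's unit image

print("lit panel pixels:", np.nonzero(integrated)[0])
```

The same construction extends to a two-dimensional lens array and to chromaticity values instead of scalar intensities, which is what the detailed description below works through.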
Preferably, the stereoscopic image data further describe color information of the stereoscopic image, and the three-dimensional space information includes the coordinate values and chromaticity values of each pixel displayed in the stereoscopic space. The physical characteristics of the display panel and the multi-optical component module include the coordinates, size and refractive index of each lens unit in the multi-optical component module, and the spatial relationship between each lens unit and the display panel.
Further, the unit images corresponding to different lens units differ in position and size because of their different ray traces, and the resulting integrated image is displayed on the display panel and projected by the multi-optical component module to reproduce the stereoscopic image adapted to the eye position.
Further, when the stereoscopic image data describe a dynamic stereoscopic image, the method for generating stereoscopic image data adapted to eye positions is repeated to generate a plurality of continuous integrated images adapted to the eye positions, which reproduce the dynamic stereoscopic image through the multi-optical component module.
For a further understanding of the nature and the technical aspects of the present application, reference should be made to the following detailed description of the application and to the accompanying drawings, which are provided for purposes of reference only and are not intended to limit the application.
Drawings
FIG. 1 is a schematic view showing one embodiment of a display device in a stereoscopic image display system;
FIG. 2 shows a second schematic view of an embodiment of a display device in a stereoscopic image display system;
FIG. 3 is a block diagram of a circuit embodiment of a stereoscopic image display system;
FIG. 4 is a diagram showing an example of forming a unit image in a ray tracing manner according to an eye position in a method for generating stereoscopic image data adapted to the eye position;
FIGS. 5A and 5B show a second example of forming a unit image by ray tracing in a method for generating stereoscopic image data adapted to eye positions;
FIG. 6 is a flowchart showing an embodiment of forming an integrated image in a method of generating stereoscopic image data adapted to eye positions; and
FIG. 7 is a flowchart showing an embodiment of a method for generating stereoscopic image data adapted to eye positions.
Detailed Description
The application discloses a method for generating stereoscopic image data adapted to eye positions and a display system. The disclosed method is applicable to a display device that is provided with multiple optical components and used for displaying stereoscopic images; an embodiment of such a display device is shown in fig. 1.
The figure shows a schematic structure of a display device in a stereoscopic image display system. The stereoscopic display device may use a liquid crystal display (LCD) panel with a backlight module (not shown in the figure), other display types with backlight modules are not excluded, or it may use an organic light emitting diode (OLED) panel with self-luminous characteristics. The display image 11 displayed on the display panel 1 is an integrated image (integral image) generated by the method flow shown in fig. 6. The integrated image is a planar image composed of a plurality of unit images (element images); in the embodiment shown, the unit images correspond to the lens groups of the multi-optical component module 2, which is composed of a plurality of lens groups, and each unit image of the display image 11 may correspond to the lens group positions in a one-to-one, one-to-many or many-to-many manner.
The multi-optical component module 2 comprises a base 21 and a lens portion 22. Each optical component of the lens portion 22 may be a lens group, a lens group may be composed of one or more convex and concave lenses, and the optical components together form a lens matrix. The system presents a stereoscopic image through the multi-optical component module 2; under this technical concept, the viewing position 5 and its angle relative to the display device affect how the integrated image and the unit images are formed. The image processing unit 12 of the display device shown in the figure can, in addition to general image processing procedures, execute the method for forming the integrated image: it adjusts the reference image, computes the unit images and forms the integrated image according to the viewing position 5, the display position of the stereoscopic image, the physical characteristics of each lens group in the multi-optical component module 2 and the spatial relationships among the components. According to this embodiment of stereoscopic image display adapted to the eye position, if the user changes the viewing position 5, the system adaptively provides viewing content appropriate to that viewing position 5.
The display device shown in the figure may be an electronic device with a flat screen, such as a mobile phone, a tablet or a computer. The display panel 1 is arranged at the lower layer and is responsible for displaying a planar image that has not yet been optically recombined, mainly the integrated image; the multi-optical component module 2 is arranged at the upper layer and regulates the light field, controlling the light angles of the stereoscopic image so that the originally unrecombined planar image can be redistributed and recombined. In this embodiment, the light of the integrated image is redistributed and recombined by the multi-optical component module 2 to display the reconstructed stereoscopic image.
The multi-optical component module 2 is composed of a plurality of lens groups arranged in an array, forming the lens portion 22. The physical characteristics of the lens portion include the refractive index and transmittance determined by its material and lens curvature; the number and arrangement of the lens groups in the lens matrix, together with the arrangement of the display panel 1, determine properties of the stereoscopic image such as its height, viewing-angle range and sharpness.
According to various embodiments, each lens group may be a single lens, a lens array, a lenticular lens or a Fresnel lens, and may be combined with a pinhole, a pinhole array, a light barrier or a specific point light source for imaging. The display device for displaying stereoscopic images may employ a display with one lens group or a display array with two or more lens groups; images are displayed on the display panel and imaged at a default position through the lenses.
Fig. 2 is another schematic diagram of an embodiment of a display device in a stereoscopic display system, in which an integrated image composed of unit images is displayed by the display panel 1 of the stereoscopic display device, and the stereoscopic image is reproduced through the multi-optical component module 2.
As in the embodiment shown in the figure, and also in fig. 1, the user sees a floating stereoscopic "3D" image from the viewing position 5. This stereoscopic image originates from the display image 11 shown by the display panel 1, which is an integrated image formed by a plurality of unit images, each corresponding to a single optical component, i.e. a lens group, in the multi-optical component module 2.
Since the lens groups of the lens portion 22 are disposed at different positions, when the multi-optical component module 2 projects a floating stereoscopic image to be seen at a certain viewing position 5, the image projected by the lens group at each position must land on a predetermined spatial position. The lens groups at different positions therefore project different images; that is, the unit images corresponding to the individual optical components differ from one another.
For example, when a floating stereoscopic image is to be projected, the optical component located to the left of the projected stereoscopic image should project its unit image at a certain projection angle toward the left side of the reproduced stereoscopic image; the optical component located to the right should project its unit image at a certain projection angle toward the right side of the reproduced stereoscopic image; and the optical component below the stereoscopic image should project its unit image upward, below the reproduced stereoscopic image, so that a real image is projected toward the human eye. In this example the floating stereoscopic image is shown at a distance from the display plane, as if floating in the air, but other embodiments do not exclude a sinking effect behind the display plane.
The stereoscopic image display system described in the above embodiments may be implemented by a circuit system, and the embodiment may refer to the circuit block diagram shown in fig. 3.
The stereoscopic image display system can be implemented in hardware and software to generate stereoscopic image data adapted to eye positions. Part of the hardware is a display device comprising circuit units that are electrically connected to each other as shown in the figure. The main components include a multi-optical component module 301, composed of a plurality of single optical components, for presenting a stereoscopic image; each optical component is a lens group, which may be composed of one or more convex and concave lenses, and the optical components together form a lens matrix as in the embodiments of fig. 1 or fig. 2. The display system further includes a display unit 303, comprising a display panel, for displaying an integrated image, which can be projected by the multi-optical component module 301 to present the stereoscopic image.
The display system includes a display driving unit 305, which may be a driving circuit of the display panel and generates image control signals to drive the display unit 303 to display the integrated image. The display system includes an image processing unit 307, which may be an image processing integrated circuit, such as an image signal processor (ISP), or a module implemented in specific software, for generating the stereoscopic image. A storage unit 311 stores stereoscopic image data received via an input interface unit 309, and a memory, such as a system memory, temporarily stores image data and system operation instructions, provides the instruction set and related image data for operation, and can serve as a buffer for files generated during system operation.
The image processing unit 307 is further electrically connected to an eye detection unit 313 for detecting the eye position of the user. The eye detection unit 313 may be a circuit module, or an independent device, disposed near the display unit 303 and the multi-optical component module 301. The eye position, which may refer to one or both of the user's eyes, can be described by a set of coordinates: the face image of the user is obtained by software (image processing) and hardware (camera) means, the eye position is detected from the eye characteristics, and the result is converted into coordinates in the same coordinate system as the stereoscopic display device. Once the eye position is found, the image processing unit 307 acquires from the storage unit 311 the stereoscopic image data of the stereoscopic image adapted to that eye position. Conversely, when the stereoscopic image display system adapted to the eye position records image data of the stereoscopic space, the eye position of the user is obtained by software or by the eye detection unit 313, the unit image corresponding to each lens unit is then calculated from the eye position, the stereoscopic image to be displayed and information such as each lens unit in the multi-optical component module 301, and the result is stored in the storage unit 311.
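The conversion of a detected eye position into the display's coordinate system can be illustrated with a short sketch. It assumes a pinhole camera model, an externally estimated eye depth and a known rigid transform between the detection camera and the display; the function name, intrinsics and numbers below are hypothetical and are not taken from the patent.

```python
# A minimal sketch of mapping a detected eye pixel into display coordinates; all values are illustrative.
import numpy as np

def eye_pixel_to_display_coords(u, v, depth_mm, fx, fy, cx, cy, R_cam_to_disp, t_cam_to_disp):
    """Back-project a detected eye pixel (u, v) at an estimated depth into the
    display's coordinate system (pinhole camera model plus a rigid transform)."""
    x_cam = (u - cx) / fx * depth_mm          # pinhole back-projection into the camera frame
    y_cam = (v - cy) / fy * depth_mm
    p_cam = np.array([x_cam, y_cam, depth_mm])
    return R_cam_to_disp @ p_cam + t_cam_to_disp   # camera frame -> display frame

# Example: a camera mounted at the top edge of the panel, axes aligned with the display.
R = np.eye(3)
t = np.array([0.0, 120.0, 0.0])               # camera offset from the display origin (mm)
eye_xyz = eye_pixel_to_display_coords(700, 350, 450.0,
                                      fx=800.0, fy=800.0, cx=640.0, cy=360.0,
                                      R_cam_to_disp=R, t_cam_to_disp=t)
print(eye_xyz)    # eye position expressed in the display's coordinate system
```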
The image processing unit 307, the input interface unit 309, the storage unit 311 and the eye detection unit 313 of the stereoscopic image display system realize an operation circuit 300 for generating stereoscopic image data adapted to eye positions. The operation circuit 300 is connected to an external stereoscopic image source 30 through the input interface unit 309, and when the image processing unit 307 executes the method for generating stereoscopic image data adapted to eye positions, it receives through the input interface unit 309 stereoscopic image data describing the three-dimensional spatial information of the stereoscopic image. The stereoscopic image source 30 may be a stereoscopic image drawn by specific software and hardware, in which the stereoscopic coordinates, chromaticity and other information of the stereoscopic image are recorded; it may include the color information and three-dimensional space information of the stereoscopic image, and may take the form of a two-dimensional planar image together with a depth map.
A spatial relative relationship is then established from the stereoscopic image data, and a reference image is created to reflect this spatial relative relationship; the reference image reflects the stereoscopic image that will finally be displayed and is set according to the stereoscopic image the user wants to display. The system then calculates the unit image corresponding to each optical component from the eye position, the reference image and the physical information related to the multi-optical component module 301, and forms an integrated image from the plurality of unit images corresponding to the multiple optical components. The integrated image is displayed by the display unit 303 under control of the display driving unit 305, and the stereoscopic image is then presented through the multi-optical component module 301.
The physical information related to the multi-optical component module mainly concerns the physical characteristics of each optical component, for example each lens unit forming a lens array, including optical characteristics such as refractive index, the arrangement position, and the spacing between adjacent lens units. It also includes at least the spatial relationship between the spatial position of the projected stereoscopic image and each optical component, such as the distance and relative angle of the stereoscopic image from each optical component (for example each lens group), and the spatial relationship between each optical component and the display panel in the display unit 303, such as the spacing between each optical component and the display panel.
These spatial relationships can be obtained by placing the whole system in the same spatial coordinate system: the distance and relative angle between the stereoscopic image and each optical component are calculated from the spatial coordinates of the stereoscopic image and the coordinates of each optical component, the relative positions of the optical components are obtained, and the distance between each optical component and the display panel is obtained. The spatial relationships may also include the relative position of each optical component on the multi-optical component module, as well as the distance from the display panel and the pixel size. The stereoscopic image data to be displayed can then be processed by the method for generating stereoscopic image data adapted to eye positions according to these spatial relationships, including setting the oblique angle of the stereoscopic image display according to the user's viewing position, performing ray tracing according to the eye position, computing the unit images, and finally forming an integrated image adapted to the eye position.
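As a small illustration of this spatial bookkeeping (with hypothetical coordinates and names, not values from the patent), the distances, tilt angles and lens-to-panel gaps can be computed once everything shares one coordinate system:

```python
# Illustrative spatial-relationship bookkeeping; coordinates are assumptions for the sketch.
import numpy as np

panel_z = 0.0
lens_centers = np.array([[x, y, 2.0] for x in (-5.0, 0.0, 5.0) for y in (-5.0, 0.0, 5.0)])
image_center = np.array([0.0, 0.0, 20.0])     # intended floating position of the stereoscopic image

for c in lens_centers:
    v = image_center - c
    distance = np.linalg.norm(v)                          # lens-to-image distance
    angle = np.degrees(np.arccos(v[2] / distance))        # tilt of that direction from the panel normal
    gap = c[2] - panel_z                                  # lens-to-panel spacing
    print(f"lens at ({c[0]:+.0f}, {c[1]:+.0f}): distance {distance:.1f} mm, "
          f"angle {angle:.1f} deg, gap {gap:.1f} mm")
```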
FIG. 4 shows an example of forming a unit image by ray tracing according to the eye position in the method for generating stereoscopic image data adapted to the eye position; the formation of the unit image and the integrated image is described with reference to the flowchart of the embodiment shown in FIG. 6.
The illustration schematically shows a user (eye 42) viewing a stereoscopic image 410 above a stereoscopic image display device: the image displayed on the display panel 40 is projected into the stereoscopic space through the array of lens groups in the multi-optical component module of the display device, presenting a floating stereoscopic image 410 (in this example, the stereoscopic word "3D"). One technique first determines the stereoscopic image to be presented, for example by acquiring stereoscopic image data from a stereoscopic image source (step S601, fig. 6), and creates a corresponding reference image, that is, the stereoscopic image 410 in the schematic diagram of fig. 4 (step S603, fig. 6). The stereoscopic image information of the reference image may include color information describing the stereoscopic image 410 and three-dimensional space information; each image pixel in the space has a set of coordinate values (x, y, z) and chromaticity values, which may be provided as planar image data and its depth map. An eye position may then be set, or the eye position of a real user obtained through detection, and the various items of physical information between the eye position and the multi-optical component module obtained (step S605). The physical information related to the multiple optical components includes the size and characteristics of the optical components (such as the lens groups), including the coordinates, size and refractive index of the single lens group and of the lens matrix, the spatial position onto which the optical components project, the spatial relationship (such as the distance) between each optical component and the display panel, and the spatial relationship between the projection position and each optical component.
When the stereoscopic image is reproduced, the planar coordinate values are combined with the depth value (z value) of each pixel recorded in the depth map, so that the coordinate values (x, y, z) describing the stereoscopic image, together with the chromaticity values, give the correct spatial position and color of each part of the image, thereby generating the stereoscopic image. The system then establishes the reference image from the received stereoscopic image data, the eye position of the user and the projection position of the stereoscopic image; the reference image reflects the three-dimensional coordinate values and chromaticity values of the displayed image. According to one embodiment, the input original stereoscopic image is transformed into the reference image by a coordinate transformation, in which a set of transformation parameters is calculated according to a coordinate transformation algorithm.
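A minimal sketch of this step, assuming a fixed pixel pitch and depth scale (both illustrative) and hypothetical array names, shows how a planar image and its depth map yield the per-pixel coordinate values (x, y, z) and chromaticity values that the reference image records:

```python
# Planar image + depth map -> per-pixel coordinates and chroma; scales and names are assumptions.
import numpy as np

def reference_points(color_image, depth_map, pixel_pitch_mm=0.1, depth_scale_mm=1.0):
    """Turn a planar image plus its depth map into per-pixel (x, y, z) coordinates
    and chromaticity values, i.e. the kind of data the reference image records."""
    h, w, _ = color_image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x = (xs - w / 2) * pixel_pitch_mm          # centre the planar coordinates on the panel
    y = (h / 2 - ys) * pixel_pitch_mm
    z = depth_map * depth_scale_mm             # the depth value becomes the z coordinate
    coords = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    chroma = color_image.reshape(-1, 3)
    return coords, chroma

# Tiny synthetic example: a 4x4 colour image whose depth rises from 10 mm to 13 mm.
img = np.random.rand(4, 4, 3)
depth = np.tile(np.array([10.0, 11.0, 12.0, 13.0]), (4, 1))
coords, chroma = reference_points(img, depth)
print(coords.shape, chroma.shape)              # (16, 3) (16, 3)
```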
When the eye position has been acquired, ray tracing information between the eye position and each lens unit is established. As shown in the figure, to create the ray tracing information between the eye 42, each lens unit (411, 412) and the corresponding unit images (401, 402), according to one embodiment a visible range (Region of Visibility, RoV) is formed by the edge position of the eye 42 and the edge position of each lens unit in the multi-optical component module, and each visible range covers a portion of the stereoscopic image 410. In fig. 4, for example, the result of ray tracing between the eye 42 and the first lens unit 411 forms a first visible range 421, and the result of ray tracing between the eye 42 and the second lens unit 412 forms a second visible range 422; in the same way, the eye 42 forms further visible ranges with the other lens units in the multi-optical component module. Each visible range covers a portion of the stereoscopic image 410 and therefore corresponds to one of the unit images (401, 402), whose computed contents and sizes differ.
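The coverage test implied by the visible range can be sketched as follows, treating the eye as a single point and each lens unit as a circular aperture; the patent's use of eye and lens edge positions is simplified here, refraction is ignored, and all names and numbers are illustrative assumptions.

```python
# Simplified RoV coverage test: point eye, circular lens aperture, no refraction.
import numpy as np

def in_visible_range(point, eye, lens_center, lens_radius):
    """True if the ray from the eye through `point` crosses the lens plane
    inside this lens unit's aperture (the RoV coverage test)."""
    point, eye, lens_center = map(np.asarray, (point, eye, lens_center))
    t = (eye[2] - lens_center[2]) / (eye[2] - point[2])   # where the eye -> point ray meets the lens plane
    hit = eye + t * (point - eye)
    return np.linalg.norm(hit[:2] - lens_center[:2]) <= lens_radius

eye = (20.0, 0.0, 400.0)          # detected eye position in display coordinates (mm)
lens = (0.0, 0.0, 2.0)            # centre of one lens unit, 2 mm above the panel
print(in_visible_range((1.0, 0.0, 30.0), eye, lens, lens_radius=2.5))   # True: this point is covered
print(in_visible_range((15.0, 0.0, 30.0), eye, lens, lens_radius=2.5))  # False: outside this lens's RoV
```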
In this example, from the obtained first and second visible ranges 421 and 422, and taking into account the optical characteristics of the corresponding first and second lens units 411 and 412, such as lens thickness, area, surface curvature, refractive index and other lens specifications, as well as their distance from the display panel, the corresponding first and second unit images 401 and 402 can be calculated. The image information of the stereoscopic space is then recorded from the plurality of ray tracing results, the lens unit information and the reference image: the image information of the stereoscopic image 410 covered by each visible range in the stereoscopic space is compared against the display pixel positions of the display panel (step S607). Specifically, the pixel information (three-dimensional coordinates and chromaticity values) of the stereoscopic image 410, recorded in memory as stereoscopic-space data, is mapped through the multi-optical component module onto the pixel values of the display panel.
In an embodiment, the method for generating stereoscopic image data establishes a coordinate transformation function between the original stereoscopic image information and the reference image, and the algorithm computes from the reference image the unit image corresponding to each optical component (e.g. each lens unit) according to the hardware characteristics, such as the physical information of each lens unit and of the display panel, and the coordinate transformation function; the unit image corresponding to each lens unit differs in position and size because of its different ray trace (step S609). A plurality of unit images corresponding to the plurality of lens units in the multi-optical component module are thus formed on the display panel, and these unit images form the integrated image that records the stereoscopic image 410 adapted to the eye position (step S611); the stereoscopic image 410 adapted to the eye position can then be reproduced according to the position of the eye 42.
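A minimal sketch of this projection step, under the simplifying assumption that each covered point is mapped through the lens centre as a pinhole (ignoring refractive index and lens thickness, which the patent takes into account), with hypothetical names and numbers:

```python
# Write one lens unit's unit image into a shared integrated-image array; values are illustrative.
import numpy as np

PIXEL_PITCH = 0.05    # panel pixel size (mm)
PANEL_Z = 0.0

def render_unit_image(covered_points, covered_chroma, lens_center, integrated):
    """Project each covered point through the lens centre onto the panel plane
    and record its chromaticity value at the corresponding display pixel."""
    h, w, _ = integrated.shape
    for p, c in zip(covered_points, covered_chroma):
        t = (p[2] - PANEL_Z) / (p[2] - lens_center[2])   # extend the point -> lens-centre ray to the panel
        hit = p + t * (lens_center - p)
        col = int(round(hit[0] / PIXEL_PITCH + w / 2))   # panel pixel column
        row = int(round(h / 2 - hit[1] / PIXEL_PITCH))   # panel pixel row
        if 0 <= row < h and 0 <= col < w:
            integrated[row, col] = c                     # this pixel belongs to this lens's unit image

integrated = np.zeros((200, 200, 3))
points = np.array([[1.0, 0.5, 30.0], [-2.0, 1.0, 28.0]])   # covered part of the stereoscopic image (mm)
chroma = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
render_unit_image(points, chroma, np.array([0.0, 0.0, 2.0]), integrated)
print(np.argwhere(integrated.any(axis=-1)))                # the lit pixels of this unit image
```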
Fig. 4 also shows that lens units at different positions correspond to unit images of different positions, sizes and contents. When the position of the eye 42 changes, the ray tracing result changes, so the unit images calculated by the method for generating stereoscopic image data adapted to the eye position described with fig. 6, and hence the final integrated image, are different. FIGS. 5A and 5B illustrate another embodiment of forming a unit image by ray tracing.
Figs. 5A and 5B schematically show a graph of the components along the X (or Y) axis and the Z axis in the coordinate system of the stereoscopic image display system. One lens unit 50 of the multi-optical component module is located near the origin (0); it may be a single lens of the multi-optical component module in the stereoscopic image display system, or a lens group of the stereoscopic image display apparatus shown in fig. 1 or 2, and the figure depicts the ray traces formed with the edges of the eye 52. It should be noted that the eye 52 may represent one or both eyes of the user, and two sets of integrated images may be calculated for the two eyes respectively.
Fig. 5A shows the ray tracing principle. A visible range (RoV) is determined for each lens unit from the position (edge) of the eye 52 and the position of the lens unit (here the lens unit 50); this is the visible range bounded by the first ray 501 and the second ray 502 in the figure. Taking into account the physical characteristics of the lens unit 50 (such as refractive index and lens thickness), the stereoscopic image data can then be introduced (the stereoscopic image, floating above the display panel and described in a stereoscopic coordinate system, is not shown in the figure), the portion of the stereoscopic image covered by each visible range is determined from the obtained visible range and the stereoscopic image data, and the display position and display range of each unit image are determined in consideration of the eye position and the spatial relationship between the display panel and each lens unit. A unit image 500 corresponding to the lens unit 50 is thus formed at the display panel position shown as the dotted line (at Z-axis coordinate 0) in the figure. Fig. 5B further shows, in coordinates, the visible range formed by the first ray 501 and the second ray 502 between the lens unit 50 and the eye 52. In the same way, according to the ray tracing principle, an individual visible range can be established between the eye 52 and each of the other lens units of the multi-optical component module, and a corresponding individual unit image can be generated from the stereoscopic coordinate range of the reference image covered by that visible range and the chromaticity values of the corresponding pixels.
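One plausible geometric reading of Figs. 5A and 5B, with illustrative coordinates only, is that the two boundary rays of the visible range can be extended from the eye through the lens-unit edges to the panel plane at z = 0 to obtain the position and extent of the unit image 500:

```python
# Worked 2-D example of the Fig. 5A/5B geometry; all coordinates are illustrative assumptions.
def ray_hit_on_panel(eye_xz, lens_edge_xz, panel_z=0.0):
    """Intersection of the ray eye -> lens edge with the panel plane (2-D x-z case)."""
    (xe, ze), (xl, zl) = eye_xz, lens_edge_xz
    t = (ze - panel_z) / (ze - zl)            # similar triangles along the z axis
    return xe + t * (xl - xe)

eye = (50.0, 400.0)                               # eye 52: (x, z) position in mm
lens_left, lens_right = (-2.5, 2.0), (2.5, 2.0)   # edges of lens unit 50, 2 mm above the panel

x_left = ray_hit_on_panel(eye, lens_left)      # where the first boundary ray (501) meets the panel
x_right = ray_hit_on_panel(eye, lens_right)    # where the second boundary ray (502) meets the panel
print(f"unit image 500 spans x = {min(x_left, x_right):.2f} .. {max(x_left, x_right):.2f} mm at z = 0")
```

Shifting the eye position in this sketch shifts and rescales the span on the panel, which mirrors the statement above that the unit images change when the eye moves.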
It should be noted that generating the reference image refers to the position of the display panel relative to each optical component (in a one-to-one, one-to-many or many-to-one correspondence) and does not necessarily refer to the user's viewing position. However, because the user may view the stereoscopic image obliquely, with the method for generating stereoscopic image data adapted to eye positions disclosed here, each unit image calculated for the multi-optical component module with its array of lens groups changes according to the eye position, so that the light leaving and passing through the multi-optical component module converges into the stereoscopic image again, whether the image floats or sinks above, below, in front of or behind the display device. Since the reference images are set at different angles in the algorithm, the finally generated unit images and integrated image differ accordingly.
Next, reference is made to the flowchart of the embodiment of displaying stereoscopic images shown in fig. 7. The relative position of the stereoscopic image and the display device is not limited, and the inventive concept may also be applied to embodiments in which two (or more) sets of display devices display the same stereoscopic image or separate stereoscopic images.
Following the foregoing embodiments, in step S701 the eye positions are detected from the face and eye characteristics; in step S703 the integrated image, in which the unit images corresponding to the lens units are recorded, is obtained from the memory of the display system. In step S705 the integrated image is supplied to the display driving unit of the display system, which is the driving circuit that drives the display unit to display the image, and the integrated image is displayed. In step S707 the corresponding unit images are projected through the multi-optical component module, and the stereoscopic image adapted to the eye position is projected to a spatial position above the multi-optical component module.
In the imaging process, the stereoscopic image may be displayed on the display plane formed by the multiple optical components of the display device, or in front of or behind it. If the stereoscopic image data describe a dynamic stereoscopic image, a plurality of continuous reference images reflecting the spatial relative relationship, and the unit images calculated for the eye position, are created in memory in advance; a plurality of integrated images are then output, and the dynamic stereoscopic image is reproduced through the multi-optical component module according to the flow described with fig. 7.
In summary, unlike conventional stereoscopic image display methods that do not consider eye positions, the method and display system for generating stereoscopic image data adapted to eye positions described in the above embodiments detect the user's eye position, trace rays accordingly, and correct the unit images according to that eye position. The method can therefore provide stereoscopic image data adapted to the detected viewing position, solving the problem of degraded stereoscopic viewing caused by an unfavorable user viewing position.
The above disclosure is only of the preferred embodiments of the present application and is not intended to limit the scope of the claims, so that all equivalent technical changes made by the application of the specification and drawings are included in the scope of the claims.

Claims (8)

1. A method for generating stereoscopic image data adapted to eye positions, the method comprising:
acquiring stereoscopic image data, wherein three-dimensional space information describing a stereoscopic image is recorded;
obtaining the position of eyes of a user and the physical characteristics of a display panel and a multi-optical component module in a display system, wherein the physical characteristics of the display panel and the multi-optical component module comprise the coordinates and the lens specification of each lens unit in the multi-optical component module, the spatial relationship between each lens unit and the display panel and the difference of positions and sizes of unit images of different lens units due to the difference of ray tracing;
forming a visible range according to the ray tracing information between the eye position and each lens unit in the multi-optical component module; and
calculating the unit images displayed on the display panel according to the part covered by the visible range corresponding to each lens unit and the physical characteristics of the display panel and the multi-optical component module, so as to form a plurality of unit images corresponding to the plurality of lens units in the multi-optical component module, wherein the plurality of unit images form an integrated image recording the stereoscopic image data adapted to the eye position, and the integrated image is displayed through the display panel and projected through the multi-optical component module to reproduce the stereoscopic image adapted to the eye position.
2. The method of generating stereoscopic image data according to claim 1, wherein the stereoscopic image data further describes color information describing the stereoscopic image, and the three-dimensional space information includes coordinate values and chromaticity values of each pixel displayed in a stereoscopic space.
3. The method of claim 1, wherein when the stereoscopic image data is a dynamic stereoscopic image, repeating the method of generating stereoscopic image data adapted to the eye position generates a plurality of continuous integrated images adapted to the eye position for reproducing the dynamic stereoscopic image through the multi-optical component module.
4. A method of generating stereoscopic image data according to any one of claims 1 to 3, wherein when the stereoscopic image data describing the stereoscopic image is obtained, a spatial relative relationship is established according to the stereoscopic image data, and a reference image is established to reflect the spatial relative relationship, the reference image being set by a user to reflect the stereoscopic image finally displayed by the display system.
5. A display system, the system comprising:
a multi-optical component module, composed of a plurality of lens units, for presenting a stereoscopic image;
a display panel for displaying an integrated image composed of a plurality of unit images, the stereoscopic image being presented through the multi-optical component module;
a display driving unit for driving the display panel to display the integrated image; and
an image processing unit for processing stereoscopic image data describing three-dimensional spatial information of the stereoscopic image to form the integrated image adapted to an eye position, wherein the method for forming the integrated image comprises:
obtaining a position of an eye of a user and physical characteristics of the display panel and the multi-optical component module, wherein the physical characteristics of the display panel and the multi-optical component module comprise coordinates and lens specifications of each lens unit in the multi-optical component module, a spatial relationship between each lens unit and the display panel and a difference in position and size of unit images of different lens units due to different ray traces;
forming a visible range according to the ray tracing information between the eye position and each lens unit in the multi-optical component module; and
calculating the unit images displayed on the display panel according to the part covered by the visible range corresponding to each lens unit and the physical characteristics of the display panel and the multi-optical component module, so as to form a plurality of unit images corresponding to the plurality of lens units in the multi-optical component module, wherein the plurality of unit images form the integrated image recording the stereoscopic image data adapted to the eye position, and the integrated image is displayed through the display panel and projected through the multi-optical component module to reproduce the stereoscopic image adapted to the eye position.
6. The display system of claim 5, wherein the stereoscopic image data further describes color information describing the stereoscopic image, the three-dimensional space information including coordinate values and chromaticity values of each pixel displayed in a stereoscopic space.
7. The display system of claim 5, wherein when the stereoscopic image data is a dynamic stereoscopic image, repeating the method for generating stereoscopic image data adapted to the eye position generates a plurality of continuous integrated images adapted to the eye position for reproducing the dynamic stereoscopic image through the multi-optical component module.
8. The display system according to any one of claims 5 to 7, wherein when the stereoscopic image data describing the stereoscopic image is obtained, a spatial relative relationship is established according to the stereoscopic image data, and a reference image is established to reflect the spatial relative relationship, the reference image being set by a user to reflect the stereoscopic image finally displayed by the display system.
CN202110415722.3A 2021-04-19 2021-04-19 Method for generating stereoscopic image data adapting to eye positions and display system Active CN115220240B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110415722.3A CN115220240B (en) 2021-04-19 2021-04-19 Method for generating stereoscopic image data adapting to eye positions and display system

Publications (2)

Publication Number Publication Date
CN115220240A (en) 2022-10-21
CN115220240B (en) 2023-11-21

Family

ID=83605017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110415722.3A Active CN115220240B (en) 2021-04-19 2021-04-19 Method for generating stereoscopic image data adapting to eye positions and display system

Country Status (1)

Country Link
CN (1) CN115220240B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1175308A (en) * 1994-12-13 1998-03-04 齐诺技术有限公司 Tracking system for stereoscopic display systems
CN1894976A (en) * 2003-12-18 2007-01-10 视真技术有限公司 Multi-user autostereoscopic display with position tracking
TW201021545A (en) * 2008-11-18 2010-06-01 Ind Tech Res Inst Stereoscopic image displaying apparatus
CN103384854A (en) * 2010-12-22 2013-11-06 视瑞尔技术公司 Combined light modulation device for tracking users
KR101741227B1 (en) * 2016-02-25 2017-05-29 주식회사 다이프로 Auto stereoscopic image display device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005040597A1 (en) * 2005-02-25 2007-02-22 Seereal Technologies Gmbh Method and device for tracking sweet spots

Also Published As

Publication number Publication date
CN115220240A (en) 2022-10-21

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant