WO2024093893A1 - Spatial reality display method, spatial reality display system, and non-volatile computer-readable storage medium - Google Patents

Spatial reality display method, spatial reality display system, and non-volatile computer-readable storage medium

Info

Publication number
WO2024093893A1
WO2024093893A1 (PCT/CN2023/127651)
Authority
WO
WIPO (PCT)
Prior art keywords
sub
display module
eye
width
pixel
Prior art date
Application number
PCT/CN2023/127651
Other languages
English (en)
French (fr)
Inventor
马希通
赵星星
段然
Original Assignee
京东方科技集团股份有限公司 (BOE Technology Group Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司 (BOE Technology Group Co., Ltd.)
Publication of WO2024093893A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/327 Calibration thereof
    • H04N13/363 Image reproducers using image projection screens

Definitions

  • the present application relates to the technical field of spatial reality display systems, and in particular to a spatial reality display method and a spatial reality display system, and a non-volatile computer-readable storage medium.
  • Naked-eye 3D display is a general term for technologies that achieve stereoscopic visual effects without the help of external tools such as polarized glasses.
  • Related 3D display technologies adapt the position of the display screen to the viewer's viewing angle, thereby achieving better naked-eye 3D display effects.
  • the embodiments of the present application provide a spatial reality display method and a spatial reality display system, and a non-volatile computer-readable storage medium to solve or alleviate one or more technical problems in the prior art.
  • an embodiment of the present application provides a spatial reality display method, including:
  • the attribute parameters of the periodic unit include the width of the periodic unit and the horizontal coordinate of the boundary of the periodic unit, wherein the width of the periodic unit is the sum of the widths of one adjacent pair of a left eye view and a right eye view, and the horizontal coordinate of the boundary of the periodic unit is the horizontal coordinate of the boundary between two adjacent periodic units;
  • the multiple sub-pixels are divided into a first sub-pixel group and a second sub-pixel group, wherein the first sub-pixel group is the sub-pixel that outputs the left eye data stream in the display module, and the second sub-pixel group is the sub-pixel that outputs the right eye data stream in the display module.
  • the embodiments of the present application also provide a spatial reality display system for executing the spatial reality display methods of all the embodiments of the present application.
  • the embodiments of the present application also provide a non-volatile computer-readable storage medium, which, when the program stored thereon is executed by a processor, can execute the spatial reality display method of all embodiments of the present application.
  • a coordinate system is determined based on the parameters of the display module, and the attribute parameters of the periodic units corresponding to the left eye view and the right eye view are determined in real time through the coordinates of the human eye position, the parameters of the display module and the coordinate system.
  • the left eye view and the right eye view of the display module for realizing 3D display are determined based on the attribute parameters of the periodic units corresponding to the left eye view and the right eye view.
  • the first sub-pixel group and the second sub-pixel group can be determined through the parameters of the display module and the attribute parameters of the periodic units corresponding to the left eye view and the right eye view, and the left eye data stream and the right eye data stream are output on the display module through the first sub-pixel group and the second sub-pixel group, thereby realizing the redistribution of pixels in the display module according to the coordinate position of the human eye to adapt to the viewer's eye position.
  • FIG. 1 shows a flow chart of a spatial reality display method according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram showing a method for implementing a spatial reality display according to an embodiment of the present application in a coordinate system.
  • FIG. 3 shows a flow chart of a spatial reality display method according to an embodiment of the present application.
  • FIG. 4 shows a flow chart of a spatial reality display method according to an embodiment of the present application.
  • FIG. 5 shows a flow chart of a spatial reality display method according to an embodiment of the present application.
  • FIG. 6 is a timing diagram of the spatial reality display method of the embodiment in FIG. 5 .
  • FIG. 7 shows a schematic structural diagram of a spatial reality display system according to an embodiment of the present application.
  • FIG. 1 is a flow chart of a spatial reality display method according to an embodiment of the present application. As shown in FIG. 1 , the spatial reality display method includes:
  • S110 Obtain the human eye position, display module parameters, and left eye data stream and right eye data stream.
  • S120 Establish a coordinate system according to the parameters of the display module.
  • S130 Determine the attribute parameters of the periodic unit corresponding to the left eye view and the right eye view according to the coordinates of the human eye position and the parameters of the display module in the coordinate system, wherein the attribute parameters of the periodic unit include the width of the periodic unit and the horizontal coordinate of the boundary of the periodic unit, and the width of the periodic unit is the sum of the widths of one adjacent pair of a left eye view and a right eye view.
  • S140 Determine a first sub-pixel group and a second sub-pixel group based on parameters of a display module, widths of periodic units of a left-eye view and a right-eye view, and horizontal coordinates of boundaries of periodic units, wherein a first sub-pixel in the first sub-pixel group is a sub-pixel that outputs a left-eye data stream in the display module, and a second sub-pixel in the second sub-pixel group is a sub-pixel that outputs a right-eye data stream in the display module.
  • the spatial reality display method of this embodiment can be applied to a display system, and specifically can be directly executed on a naked-eye 3D display module, for example, by executing the spatial reality display method through a processor, a processing chip, etc. in the display module, and displaying it on the display module. It is also possible to control the naked-eye 3D display module to output the left-eye data stream on the first sub-pixel and the right-eye data stream on the second sub-pixel after executing the spatial reality display method through another controller or processor.
  • the display module can be a display screen, a monitor, etc., which is a device for naked-eye 3D display. The following embodiments are illustrated by taking the display system as the execution subject.
  • a coordinate system is established based on the parameters of the display module, and the attribute parameters of the periodic units corresponding to the left eye view and the right eye view are determined in real time (i.e., for each frame of the video) through the coordinates of the human eye position and the physical parameters of the display module in the coordinate system.
  • the attribute parameters of the periodic units corresponding to the figure can determine the left eye view and the right eye view of the display module for realizing 3D display.
  • the first sub-pixel group and the second sub-pixel group can be determined through the parameters of the display module and the attribute parameters of the periodic units corresponding to the left eye view and the right eye view, and the left eye data stream and the right eye data stream are output on the display module respectively through the first sub-pixel group and the second sub-pixel group, that is, the sub-pixels in the display module are redistributed according to the position of the human eye to adapt to the position of the viewer's human eye.
  • the viewer can clearly watch the naked eye 3D display at different angles or positions without moving the display screen. This avoids the problem that a moving display screen makes the human eye position difficult to locate accurately and thereby degrades the naked eye 3D display effect, bringing a better naked eye 3D visual effect and enhancing the viewer's viewing experience.
  • In step S110, the eye position, display module parameters, and the left-eye and right-eye data streams are obtained.
  • the left eye data stream and the right eye data stream are usually determined according to the video signal, for example, they can be the left eye data stream and the right eye data stream parsed after the display device receives the video stream.
  • the video signal is a signal carrying the video playback content.
  • the key point is to use binocular parallax to project different video stream images to the left eye and the right eye, so that the human eye can produce a 3D visual effect.
  • the left eye data stream is used to control the 3D display module to output a video stream suitable for viewing by the left eye (such as a left eye visual image)
  • the right eye data stream is used to control the 3D module to output a video stream suitable for viewing by the right eye (such as a right eye visual image), thereby forming a naked eye 3D visual image on the 3D display module.
  • There are usually video signals of different modes, and different modes require different processing.
  • For a video signal in 3D-FrameByFrame mode with asynchronous timing, adjusting the left-eye video stream and the right-eye video stream in the video signal allows the generated left-eye data stream and right-eye data stream to be synchronized, thereby avoiding the poor naked-eye 3D visual effect that would be caused by unsynchronized left-eye and right-eye data streams.
  • the size of the video signal is affected by the input interface between the device generating the 3D content and the display system. Due to the constraints of the output interface, the video resolution is generally 8K. At this time, dividing the video signal into the left-eye video stream and the right-eye video stream will cause the generated left-eye data stream and the right-eye data stream to be data streams with a resolution less than or equal to 4K. Such data streams are output to the 3D display module, and the display effect is poor and the display resolution is low, resulting in a poor naked-eye 3D viewing experience.
  • the left-eye video stream and the right-eye video stream are horizontally stretched respectively.
  • their resolutions are effectively improved, so that a clearer frame image can be output on the 3D display module, thereby ensuring the display effect of the naked-eye 3D image.
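As a rough illustration of this split-and-stretch step, the sketch below (plain Python; pixel repetition stands in for whatever scaling filter a real system would use, and all names are illustrative) splits a side-by-side frame into the two eye streams and restores each half to full width:

```python
def split_and_stretch_side_by_side(frame):
    """Split a 3D-SideBySide frame into left/right-eye streams.

    frame: list of rows, each row a list of pixel values.
    Each half is stretched back to the original width by repeating
    every pixel horizontally; a real implementation would more
    likely interpolate.
    """
    width = len(frame[0])
    left_stream, right_stream = [], []
    for row in frame:
        left_half = row[: width // 2]
        right_half = row[width // 2 :]
        # Repeat each pixel twice to restore the original width.
        left_stream.append([p for p in left_half for _ in range(2)])
        right_stream.append([p for p in right_half for _ in range(2)])
    return left_stream, right_stream

# Tiny 4-pixel-wide row standing in for a 7680-pixel-wide frame.
left, right = split_and_stretch_side_by_side([[1, 2, 3, 4]])
```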
  • the video signal may be a signal of 3D content, and the 3D content may be generated on a PC (Personal Computer), for example, on a 3D software (such as UE/Unity).
  • the video signal may also be generated by, for example, a laptop, a 3D content generation processor, or a cloud server, or even on some mobile terminals. The present disclosure does not limit the generation method of 3D content.
  • the device that generates the video signal may be a device in the display system, or a device outside the display system that sends the generated video signal to the display system. The resolution of the video signal may be 7680X4320@60Hz, and the video signal may be in 3D-SideBySide (parallel 3D signal) mode or 3D-FrameByFrame (sequential 3D signal) mode. Specifically, it may be output to the display system through the output interface HDMI 2.0X4.
  • the output video signal may include a video image with a video resolution of 7680X4320@60Hz.
  • the display system is provided with an HDMI_RX module electrically connected to the output interface, and the HDMI_RX module includes an HDMI 2.0X4 interface to receive the video signal.
  • the left eye data stream and the right eye data stream can be determined, wherein the left eye data stream is used to control the 3D display module to output a video stream suitable for left eye viewing (e.g., left eye visual image), and the right eye data stream is used to control the 3D display module to output a video stream suitable for right eye viewing (e.g., right eye visual image), so as to form a naked eye 3D visual image.
  • the left eye data stream is used to control the 3D display module to output a video stream suitable for left eye viewing (e.g., left eye visual image)
  • the right eye data stream is used to control the 3D display module to output a video stream suitable for right eye viewing (e.g., right eye visual image)
  • the eye position information can be identified by a visual recognition device to determine the eye position based on the eye position information.
  • the coordinates of the eye position in the established coordinate system can include the position coordinates of the left eye and the right eye.
  • the position coordinates may also include the position coordinates between the left eye and the right eye, and the position coordinates may be three-dimensional coordinates or two-dimensional coordinates.
  • a camera may be used to capture a facial image
  • an eye recognition device such as a SOC (System on Chip), a trained neural network, etc., may be used to recognize the facial image and determine the position of the eyes.
  • An integrated visual recognition device may also be used to directly recognize the eye position information, and determine the eye position based on the eye position information.
  • the visual recognition device may be used as part of a display system, or may be a device electrically connected to the outside of the display system, and the generated left eye position coordinates, right eye position coordinates, position coordinates between the left and right eyes, and other eye position coordinates are sent to the display system, so that the display system can obtain the eye position coordinates.
  • the display module can often be a 3D module.
  • the 3D module in this embodiment and other embodiments is a display module.
  • the display module includes a display panel, which can output images that produce a naked-eye 3D display effect, so that viewers can watch 3D images.
  • the naked eye 3D display module usually includes a lens 3D display module and a slit grating 3D display module.
  • the naked eye 3D display device based on the slit grating consists of a 2D liquid crystal display and a slit grating.
  • the naked eye 3D display device based on the cylindrical lens grating usually consists of a 2D liquid crystal display and a cylindrical lens grating; its display principle is similar to that of the slit grating stereo display, and both are achieved by encoding parallax images of different angles on the 2D display panel.
  • the cylindrical lens grating is usually composed of many cylindrical lenses with the same structure arranged in parallel. Since the cylindrical lens is usually made of transparent medium material, it will not block the light when modulating and encoding the 2D image. Compared with the slit grating, the naked eye 3D display based on the cylindrical lens grating has the advantage of high brightness.
  • the present application will take the lens 3D display module as an example to describe the spatial reality display method and spatial reality display system of the present application. However, as mentioned above, given that the imaging principles of the lens 3D display module and the slit grating 3D display module are basically similar, the spatial reality display method and spatial reality display system described below in this application are also equally applicable to the slit grating 3D display module.
  • the length, width and height of the display module, the surface center of the display module, the length, width and height of the LCD screen in the display module, the prism size and prism arrangement in the display module, the length direction of the display module, and the width of the display module are all physical property parameters of the display module, which are fixed for the manufactured display module and usually will not change due to routine movement or routine operation.
  • the parameters of the display module are usually stored in the memory of the display module or recorded in the corresponding position when leaving the factory.
  • the display device of this embodiment can obtain the parameters of the display module by reading the memory of the display module or by manual input, and can also determine the model of the display module by reading, and obtain the parameters of the display module in a space with the display module model record such as a database or the Internet.
  • the parameters of the display module at least include the center of the upper surface of the display module (i.e., the surface facing the user), the vertical direction of the upper surface of the display module, the width direction of the display module, the first width of the prism in the display module, the first distance from the prism surface to the liquid crystal screen in the display module, and the second width of the liquid crystal screen in the display module; the parameters of the display module may also include the second distance between the midpoint of each sub-pixel of the liquid crystal screen in the display module and the edge of the liquid crystal screen, etc.
  • the first distance from the prism surface to the liquid crystal screen in the display module may be equal to the thickness of the prism, or may not be equal to the thickness of the prism, which depends on the design of the display module and is also a physical property of the display module.
  • the above parameters of the display module are all known physical properties when leaving the factory.
  • In step S120, a coordinate system is established according to the parameters of the display module.
  • the parameters of the display module include the center of the upper surface of the display module, the vertical direction of the upper surface of the display module, and the width direction of the display module.
  • other parameters may also be included, such as the length direction of the display module, the center of the LCD screen surface in the display module, etc.
  • a two-dimensional coordinate system is established with the center of the upper surface of the display module as the origin, the vertical direction of the upper surface of the display module as the Z coordinate axis direction, and the width direction of the display module as the X coordinate axis direction.
  • a three-dimensional coordinate system can also be established with the center of the upper surface of the display module as the origin, the vertical direction of the upper surface of the display module as the Z coordinate axis direction, the width direction of the display module as the X coordinate axis direction, and the length direction of the display module as the Y coordinate axis direction.
  • a two-dimensional or three-dimensional coordinate system can also be established with the center of the liquid crystal screen surface in the display module as the origin.
  • the coordinate system should at least include a coordinate system associated with the width direction of the display module and the vertical direction of the display module, so that the human eye position coordinates can be located in the two-dimensional or three-dimensional coordinate system, thereby determining the arrangement of the first sub-pixel group and the second sub-pixel group for the left eye data stream and the right eye data stream.
  • the establishment of the coordinate system can be adjusted as needed and is not limited here.
  • In step S130, the attribute parameters of the periodic unit corresponding to the left eye view and the right eye view are determined according to the coordinates of the human eye position and the parameters of the display module in the coordinate system.
  • the attribute parameters of the periodic unit include the width of the periodic unit and the horizontal coordinate of the boundary of the periodic unit, wherein the width of the periodic unit is the sum of the widths of one adjacent pair of a left eye view and a right eye view.
  • the present disclosure describes a display module including multiple groups of prisms and a liquid crystal screen as an example, wherein the liquid crystal screen is far away from the viewer relative to the prisms, and the viewer usually obtains the display image of the liquid crystal screen through the prisms.
  • the parameters of the display module include relevant physical properties such as the first width of the prism in the display module and the first distance from the surface of the prism to the liquid crystal screen in the display module.
  • the first width of the prism in this embodiment is the width of a single prism.
  • the first width P of the prism is a property that has been determined during the design and manufacturing of the display module and can be directly obtained.
  • the first distance H from the surface of the prism to the liquid crystal screen can also be directly obtained based on the design parameters of the display module, wherein the first distance H is specifically the minimum distance between the surface of the prism away from the liquid crystal screen and the surface of the liquid crystal screen, that is, the first distance H can also be directly obtained based on the physical properties of the display module.
  • the periodic unit is the smallest unit composed of adjacent left-eye views and right-eye views in a plurality of left-eye views and right-eye views distributed on the liquid crystal screen of the display module, which is divided into a frame of image for realizing 3D display.
  • the left-eye view and the right-eye view are arranged alternately and a plurality of periodic units are arranged in sequence, forming the arrangement of the left-eye view and the right-eye view of the liquid crystal screen of the display module, so as to produce a naked-eye 3D display effect.
  • the left-eye view and the right-eye view are the left image and the right image divided by a frame of display screen in the left-eye data stream and the right-eye data stream correspondingly output by the first sub-pixel group and the second sub-pixel group on the liquid crystal screen in order to realize the 3D display effect.
  • the periodic unit is the smallest unit composed of adjacent left-eye views and right-eye views in the left-eye view and the right-eye view, and each periodic unit corresponds to at least one first sub-pixel and one second sub-pixel on the surface of the display screen.
  • the arrangement of the left eye view and the right eye view is determined by the periodic unit and the boundary of the periodic unit.
  • the arrangement is related to the width of the liquid crystal screen in the display module, or it can be said to be related to the sub-pixel arrangement in the width direction.
  • the horizontal coordinate of the boundary of the periodic unit can be directly used as the starting point, and the liquid crystal screen can be divided in the width direction according to the width of the periodic unit, so that the arrangement of the left eye view and the right eye view on the display screen can be obtained.
  • the boundary of the periodic unit can be the edge of the liquid crystal screen, or it can be the boundary at the junction between two periodic units.
  • the attribute parameters of the periodic unit corresponding to the left eye view and the right eye view are determined, that is, the width of the periodic unit and the horizontal coordinate of the boundary of the periodic unit are determined.
  • FIG2 shows a schematic diagram of implementing a spatial reality display method according to an embodiment of the present application in a coordinate system.
  • a three-dimensional coordinate system is established with the center of the screen (the center of the upper surface of the 3D module) as the origin, the direction perpendicular to the prism surface as the Z axis, the width direction of the upper surface of the prism as the X axis, and the length direction of the prism as the Y axis.
  • the coordinates of the human eye position are determined in the three-dimensional coordinate system.
  • the adaptive arrangement of the left-eye view and the right-eye view enables the viewer, at the current viewing angle, to see with the left eye the left-eye view output by the left-eye data stream (the black parts of the left-eye view and the right-eye view in FIG3 are the left-eye view) and to see with the right eye the right-eye view output by the right-eye data stream (the white parts are the right-eye view).
  • the left-eye view and the right-eye view are generated by dividing each frame of the image relative to the display screen to achieve the naked-eye 3D display effect.
  • the left-eye view and the right-eye view serve as display effect diagrams in the present application. Since the left-eye view and the right-eye view are constructed by the first sub-pixels and the second sub-pixels respectively, the left-eye view is constructed by the first sub-pixels when the left-eye data stream is input, and the right-eye view is constructed by the second sub-pixels when the right-eye data stream is input. Based on the left-eye view and the right-eye view, the first sub-pixel group and the second sub-pixel group can be determined.
  • the coordinates M1 (X1, Z1) of the midpoint of the line connecting the left eye and the right eye are calculated from the left eye position coordinates ML (XL, ZL) and the right eye position coordinates MR (XR, ZR) according to the following formulas (1) and (2); alternatively, the midpoint coordinates M1 (X1, Z1) may already be included in the human eye position coordinates.
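The bodies of formulas (1) and (2) were rendered as images in the original publication and did not survive extraction. From the surrounding description (the midpoint of the line connecting the two eyes), they are presumably the coordinate averages:

```latex
X_1 = \frac{X_L + X_R}{2} \quad (1)
\qquad
Z_1 = \frac{Z_L + Z_R}{2} \quad (2)
```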
  • the widths of the periodic units of the left eye view and the right eye view are determined as follows:
  • the width of the prism corresponds to the width of a periodic unit of a left-eye view and a right-eye view displayed on a liquid crystal screen.
  • a periodic unit is composed of adjacent minimum units of left-eye view and minimum units of right-eye view.
  • the width of the periodic unit can be obtained in the following manner: the midpoint M1 of the left eye and right eye connection line is connected to the two ends of a prism to form two connection lines, and the two connection lines are extended to the surface of the liquid crystal screen to form two intersection points M2 and M3 with the liquid crystal screen.
  • the portion between the two intersection points M2 and M3 in the picture displayed by the liquid crystal screen is a periodic unit of the left eye view and the right eye view, that is, the minimum unit of the distribution of the left eye view and the right eye view.
  • the multiple left eye views and right eye views of a frame display picture are repeatedly arranged in the width direction of the display screen in units of periodic units, that is, the sub-pixels on the display screen are divided into corresponding first sub-pixel groups and second sub-pixel groups according to the arrangement of the left eye view and the right eye view on the display screen, and the left eye data stream is input to each first sub-pixel in the first sub-pixel group, and the right eye data stream is input to each second sub-pixel in the second sub-pixel group, so that the left eye view and the right eye view can be formed. Therefore, by determining the distance between the two intersection points M2 and M3, the width of a periodic unit of the left eye view and the right eye view can be determined.
  • the upper surface of the prism and the upper surface of the LCD screen are generally parallel, of course there may be negligible errors.
  • the midpoint of the line connecting the left eye and the right eye and the line connecting the two ends of a prism form a triangle
  • the midpoint of the line connecting the left eye and the right eye and the intersection points M2 and M3 form another triangle. Since the upper surface of the prism is generally parallel to the upper surface of the liquid crystal screen, that is, the bases of the two triangles are parallel, the two triangles form similar triangles.
  • the width ΔX of one periodic unit of the left eye view and the right eye view can be calculated by the following formula (3) and the formula (4) converted from formula (3):
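The bodies of formulas (3) and (4) are likewise missing from this extraction. From the similar-triangle argument above (a base of width P at distance Z1 from the eye midpoint M1, and a base of width ΔX on the liquid crystal screen at distance Z1 + H), they are presumably:

```latex
\frac{P}{\Delta X} = \frac{Z_1}{Z_1 + H} \quad (3)
\qquad\Rightarrow\qquad
\Delta X = \frac{(Z_1 + H)\,P}{Z_1} \quad (4)
```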
  • the abscissa of the boundary of the periodic unit is the abscissa of the intersection point M2, i.e., the X-axis coordinate of M2.
  • one end of the prism is set to the center position of the surface of the display screen, that is, the origin position of the three-dimensional coordinate system, and the coordinate of the end is (0, 0).
  • the Z-axis coordinate Z2 of the intersection M2 is H.
  • the coordinates of the intersection M2 (the horizontal coordinate of M2 and the Z-axis coordinate of M2) and the origin of the coordinate system form a triangle.
  • the X-axis coordinate of the midpoint M1 of the line connecting the left eye and the right eye, the Z-axis coordinate of the midpoint M1 and the origin of the coordinate system form another triangle, and because the upper surface of the prism is generally parallel to the upper surface of the liquid crystal screen, that is, the bases of the two triangles are parallel, therefore, the two triangles form similar triangles.
  • Using the midpoint coordinates M1 (X1, Z1) of the line connecting the left eye and the right eye and the first distance H from the prism surface to the liquid crystal screen (i.e., the Z-axis coordinate Z2 of M2 is H), the horizontal coordinate X2 of the boundary of the periodic unit can be calculated by the following formula (5) and the formula (6) converted from formula (5):
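The bodies of formulas (5) and (6) are also missing here. From the similar triangles sharing the coordinate origin (M1 at height Z1 above the prism surface, M2 at depth H below it), they presumably relate the magnitudes as follows; the sign of X2 depends on the coordinate convention, since M2 lies on the opposite side of the origin from M1 along the ray through the prism end:

```latex
\frac{X_2}{H} = \frac{X_1}{Z_1} \quad (5)
\qquad\Rightarrow\qquad
X_2 = \frac{H\,X_1}{Z_1} \quad (6)
```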
  • the specific arrangement of all left-eye views and right-eye views of a frame of image can be determined according to the width of the liquid crystal screen.
  • S140 Determine a first sub-pixel group and a second sub-pixel group based on parameters of a display module, widths of periodic units of a left-eye view and a right-eye view, and horizontal coordinates of boundaries of periodic units, wherein a first sub-pixel in the first sub-pixel group is a sub-pixel that outputs a left-eye data stream in the display module, and a second sub-pixel in the second sub-pixel group is a sub-pixel that outputs a right-eye data stream in the display module.
  • the two sides of the boundary of the periodic unit correspond to a first sub-pixel and a second sub-pixel respectively.
  • the horizontal coordinate of the boundary of each periodic unit arranged along the width direction of the liquid crystal screen can be determined.
  • after the horizontal coordinate of the boundary of each periodic unit is determined, the center position coordinate of each sub-pixel is divided by the width of the periodic unit to determine which periodic unit the sub-pixel corresponds to, and the remainder is obtained at the same time. Further, by judging the remainder, it can be determined whether the sub-pixel is a first sub-pixel or a second sub-pixel. Specifically, if the remainder is greater than half the width of the periodic unit, it is determined to be a second sub-pixel; if the remainder is less than or equal to half the width of the periodic unit, it is determined to be a first sub-pixel. On this basis, the first sub-pixels and the second sub-pixels can be accurately determined without physical adjustment; that is, the viewer can achieve a good naked-eye 3D viewing effect simply by controlling the left-eye data stream to be input into the first sub-pixel group and the right-eye data stream into the second sub-pixel group.
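The remainder test above can be sketched as follows. This is a minimal sketch: measuring the sub-pixel centre relative to the computed boundary abscissa is my assumption, since the exact form of the division is not reproduced here:

```python
def classify_sub_pixel(center_x: float, boundary_x: float, unit_width: float) -> str:
    """Return 'first' (left-eye) or 'second' (right-eye) for one sub-pixel.

    The offset of the sub-pixel centre from the periodic-unit boundary
    is divided by the unit width; the quotient identifies the periodic
    unit and the remainder identifies which half of the unit it falls in.
    """
    remainder = (center_x - boundary_x) % unit_width
    # remainder <= half a periodic unit -> first sub-pixel (left-eye data stream)
    return "first" if remainder <= unit_width / 2 else "second"
```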
  • the first sub-pixel group determined in this embodiment is a set of first sub-pixels, and the second sub-pixel group is also a set of second sub-pixels.
  • the first sub-pixel group and the second sub-pixel group can be determined, and the corresponding left-eye data stream and right-eye data stream are output through them, so that the viewer can clearly see the naked-eye 3D display effect.
  • since determining the rearrangement of the first sub-pixels and the second sub-pixels from the change of the human eye position involves only the Z-axis and X-axis coordinates, the Y-axis coordinate of the human eye position need not be obtained; in the present disclosure, only the X-axis and Z-axis coordinates of the left eye and the X-axis and Z-axis coordinates of the right eye are required.
  • since the calculation uses the coordinates of the midpoint of the line connecting the left eye and the right eye, the human eye position coordinates may refer only to the three-dimensional coordinates of that midpoint, or to its X-axis and Z-axis coordinates.
  • the first sub-pixel group for outputting the left-eye data stream in the display module is determined by the eye position coordinates, and the second sub-pixel group for outputting the right-eye data stream in the display module is determined.
  • by controlling the left-eye data stream to be output in the first sub-pixel group and the right-eye data stream to be output in the second sub-pixel group, a first image for the left eye and a second image for the right eye can be generated, thereby providing a better naked-eye 3D visual effect according to the eye position and improving the viewing experience of the viewer.
  • when the eye position changes, the eye position coordinates are re-determined; the first sub-pixel group and the second sub-pixel group are then re-determined according to these coordinates, the left-eye data stream is controlled to be output in the first sub-pixel group and the right-eye data stream in the second sub-pixel group, and the first image for the left eye and the second image for the right eye are regenerated.
  • the sub-pixels in the 3D display module are logically rearranged to adapt to the viewer's eye position, so that the viewer can clearly view the 3D image without additional physical adjustment of the display system, avoiding errors caused by physical adjustment, and also improving its display effect, which more effectively improves the user experience.
  • the parameters of the display module include the center of the upper surface of the display module, the vertical direction of the upper surface of the display module, and the width direction of the display module; according to the parameters of the display module, determining the coordinate system includes:
  • the center of the upper surface of the display module is taken as the origin, the vertical direction of the upper surface of the display module is taken as the first coordinate axis direction, and the width direction of the display module is taken as the second coordinate axis direction, thereby determining the coordinate system.
  • the parameters of the display module may include physical properties of the display module such as the center of the upper surface of the display module, the vertical direction of the upper surface of the display module, and the width direction of the display module, and may also include parameters such as the length direction of the display module.
  • since determining the pixel rearrangement of the first sub-pixels and the second sub-pixels involves only the Z-axis and X-axis coordinates, the Y-axis coordinate of the human eye position need not be obtained; only the X-axis and Z-axis coordinates of the left eye and of the right eye are required.
  • a two-dimensional coordinate system is established with the center of the upper surface of the display module (the exact center of the upper surface of the 3D module) as the origin, the vertical direction of the upper surface of the display module as the Z-axis direction, and the width direction of the display module as the X-axis direction, and the human eye position coordinates are converted into the two-dimensional coordinate system. This makes it easy to calculate the width of the periodic unit and the horizontal coordinate of the boundary of the periodic unit in the attribute parameters of the periodic unit, reduce the amount of calculation, and improve the efficiency of 3D content processing.
  • a three-dimensional coordinate system can also be established based on the above two-dimensional coordinate system, with the length direction of the display module as the Y-axis direction, which will not be described in detail here.
  • the parameters of the display module include a first width of a prism in the display module and a first distance from a surface of the prism in the display module to a liquid crystal screen; determining the width of a periodic unit of a left-eye view and a right-eye view and the horizontal coordinate of a boundary of the periodic unit according to the coordinates of a human eye position, the parameters of the display module and a coordinate system includes:
  • the width of the periodic unit and the horizontal coordinate of the boundary of the periodic unit are determined according to the first width, the first distance, the coordinates of the human eye position and the coordinate system.
  • the first width of the prism in this embodiment is the width of a single prism.
  • the first width P of the prism can be directly determined according to the design of the prism, that is, it is a property that has been determined when the display module is designed and manufactured, and can be directly read.
  • the first distance H from the prism surface to the liquid crystal screen can also be obtained directly from the design of the display module, wherein the first distance H is specifically the minimum distance between the surface of the prism away from the liquid crystal screen and the surface of the liquid crystal screen; that is, the first distance H is likewise a characteristic determined when the display module is designed and manufactured and can be read directly.
  • a coordinate system is established by the method of the above embodiment, and the coordinate system is a two-dimensional or three-dimensional coordinate system.
  • the eye position coordinates are converted into the coordinate system.
  • based on the first distance and the first width, the periodic units of the left-eye view for outputting the left-eye data stream and of the right-eye view for outputting the right-eye data stream on the display module are calculated by similar triangles.
  • any left eye view corresponds to at least one first sub-pixel for outputting the left eye data stream
  • any right eye view corresponds to at least one second sub-pixel for outputting the right eye data stream. Based on the first sub-pixel and the second sub-pixel output corresponding left eye data stream and right eye data stream, the viewer can clearly see the naked eye 3D effect.
  • the first width of the prism in the display module and the first distance from the prism surface to the liquid crystal screen in the display module are physical properties of the display module and can be easily obtained.
  • the width of the periodic unit and the horizontal coordinate of the boundary of the periodic unit can be easily and quickly calculated, so that the first sub-pixel group for displaying the left eye view and the second sub-pixel group for displaying the right eye view can be easily obtained without introducing other variables, thereby reducing the situation of system errors.
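The formula for the periodic-unit width ΔX is not reproduced in this excerpt. The sketch below uses the standard similar-triangle projection of the prism pitch onto the screen plane; that relation, and the function name, are my assumptions rather than the application's exact formula:

```python
def periodic_unit_width(prism_width: float, first_distance: float, z1: float) -> float:
    """Width ΔX of one periodic unit on the liquid crystal screen.

    A ray from the eye midpoint (height Z1 above the screen) through
    the prism surface (height H) projects the prism pitch P onto the
    screen plane; by similar triangles ΔX = P * Z1 / (Z1 - H).
    """
    if z1 <= first_distance:
        raise ValueError("the viewer must be farther from the screen than the prism surface")
    return prism_width * z1 / (z1 - first_distance)
```

Note that under this assumption ΔX is always slightly larger than the prism pitch and grows as the viewer approaches the screen.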
  • the parameters of the display module include a second width of the liquid crystal screen in the display module, and determining the first sub-pixel group and the second sub-pixel group according to the parameters of the display module, the width of the periodic unit of the left-eye view and the right-eye view, and the abscissa of the boundary of the periodic unit includes:
  • S310 obtaining a second width of the liquid crystal screen in the display module, and determining a first distribution margin of the liquid crystal screen according to the width of the periodic unit, the horizontal coordinate of the boundary of the periodic unit, and the second width;
  • the first distribution margin can be regarded as a remaining portion on the liquid crystal screen corresponding to an integer multiple of the periodic units, the width of the remaining portion on the X-axis is less than the width of one periodic unit, and after removing the remaining portion, the width of other portions of the liquid crystal screen is an integer multiple of the width of the periodic unit;
  • S320 Determine a first sub-pixel group and a second sub-pixel group according to the first distribution margin and the width of the period unit.
  • the parameters of the display module include a second width of the liquid crystal screen in the display module, wherein the second width is the overall width of the liquid crystal screen in the horizontal direction, for example, the width along the X-axis direction shown in FIG. 2. According to the first distribution margin and the width of the periodic unit, the first sub-pixel group and the second sub-pixel group are determined.
  • because the pixel arrangement of the liquid crystal screen is fixed, after the width of the periodic unit and the horizontal coordinate of its boundary are determined, the periodic units tiled from that boundary may exceed the edge of the liquid crystal screen; that is, the width of the liquid crystal screen may not divide evenly into whole periodic units, so a margin for the left-eye view and the right-eye view remains at the edge of the liquid crystal screen, namely the first distribution margin.
  • the impact of this margin on the first sub-pixels and the second sub-pixels needs to be considered. If the liquid crystal screen is completely tiled by periodic units according to the width of the periodic unit and the horizontal coordinate of the boundary of the periodic unit, that is, if all sub-pixels of the entire liquid crystal screen can generate corresponding left-eye views and right-eye views, then the margin problem need not be considered.
  • the second width W of the LCD screen is the overall width of the LCD screen, which is a physical property of the display module and can be directly obtained.
  • the first distribution margin of the liquid crystal screen is determined according to the width of the periodic unit, the horizontal coordinate of the boundary of the periodic unit, and the second width. Since the periodic units of the left-eye view and the right-eye view are arranged in sequence to both sides according to the width of the periodic unit, starting from the calculated horizontal coordinate of the boundary of the periodic unit, this embodiment must consider the influence of that horizontal coordinate on the first distribution margin ΔB.
  • margins of the left-eye view and the right-eye view at the edge of the liquid crystal screen: that is, a part of the edge of the liquid crystal screen cannot cover the width of a full periodic unit after the periodic units are arranged in sequence as described above; for example, it may cover half the width of a periodic unit, or a quarter of it.
  • This part of the margin is referred to as the first distribution margin in this application.
  • the existence of the first distribution margin ΔB at the edge of one side of the liquid crystal screen is used as an example to discuss how to calculate the first distribution margin.
  • X2 is the horizontal coordinate of the boundary of the periodic unit
  • ΔX is the width of the periodic unit
  • W is the second width of the liquid crystal screen.
  • the first distribution margin ΔB can be obtained by formula (7). Specifically, after each periodic unit is determined, since the position of each sub-pixel of the LCD screen is fixed, the width position of each sub-pixel plus the first distribution margin, divided by the width of the periodic unit, determines both which periodic unit the sub-pixel corresponds to and the remainder; by judging the remainder, it can further be determined whether the sub-pixel is a first sub-pixel or a second sub-pixel. Specifically, if the remainder is greater than half the width of the periodic unit, it is determined to be a second sub-pixel; if the remainder is less than or equal to half the width of the periodic unit, it is determined to be a first sub-pixel.
  • the first sub-pixel and the second sub-pixel can be accurately determined without physical adjustment.
  • the viewer can achieve the naked-eye 3D display effect by controlling the left-eye data stream to be input into the first sub-pixel group and the right-eye data stream to be input into the second sub-pixel group.
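Formula (7) is not reproduced in this excerpt. One plausible reading, assuming the screen spans [-W/2, W/2] on the X axis and the units are tiled outward from the boundary abscissa X2, is sketched below; the modulo form is my assumption:

```python
def first_distribution_margin(x2: float, unit_width: float, screen_width: float) -> float:
    """First distribution margin ΔB at one screen edge (sketch of formula (7)).

    Periodic units of width ΔX are tiled in sequence starting from the
    boundary abscissa X2; the left screen edge sits at -W/2, so the
    strip left over there is the edge-to-boundary distance modulo ΔX.
    """
    return (x2 + screen_width / 2) % unit_width
```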
  • the parameters of the display module further include a second distance between the midpoint of each sub-pixel of the liquid crystal screen in the display module and the edge of the liquid crystal screen; determining the first sub-pixel group and the second sub-pixel group according to the first distribution margin and the width of the periodic unit includes:
  • S410 determining a second distribution margin of a sub-pixel according to the second distance, the first distribution margin, and the width of the periodic unit;
  • the physical position of each pixel on the LCD screen is determined.
  • implementing sub-pixel rearrangement refers to adjusting whether a sub-pixel outputs the left-eye view or the right-eye view, for example, using a sub-pixel as a first sub-pixel to display the left-eye view or as a second sub-pixel to display the right-eye view.
  • rearranging the sub-pixels in the LCD screen can generate a naked eye 3D effect that adapts to the viewer's eye position, and what is essentially changed is the image content output by each sub-pixel.
  • a pixel generally includes three types of sub-pixels, namely, a red sub-pixel R, a green sub-pixel G, and a blue sub-pixel B.
  • the second distance between the midpoint of each sub-pixel and the edge of the liquid crystal screen is determined by the following formulas (8)-(10).
  • ΔRJ is the second distance between the midpoint of the red sub-pixel and the edge of the LCD screen
  • ΔGJ is the second distance between the midpoint of the green sub-pixel and the edge of the LCD screen
  • ΔBJ is the second distance between the midpoint of the blue sub-pixel and the edge of the LCD screen
  • J represents the column in which the pixel is located
  • N is the pixel width.
  • a pixel includes two red sub-pixels, two green sub-pixels and two blue sub-pixels arranged side by side. Therefore, according to the arrangement of the sub-pixels within the pixel, formulas (8)-(10) subtract a corresponding positional offset for the red sub-pixel, the green sub-pixel, and the blue sub-pixel respectively.
  • Position R is the second distribution margin of the red sub-pixel
  • Position G is the second distribution margin of the green sub-pixel
  • Position B is the second distribution margin of the blue sub-pixel
  • ΔRJ is the second distance between the midpoint of the red sub-pixel and the edge of the LCD screen
  • ΔGJ is the second distance between the midpoint of the green sub-pixel and the edge of the LCD screen
  • ΔBJ is the second distance between the midpoint of the blue sub-pixel and the edge of the LCD screen
  • ΔX is the width of the periodic unit
  • ΔB is the first distribution margin.
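The per-sub-pixel margin formulas are not reproduced in this excerpt. Following the remainder rule stated earlier, the second distribution margin and the resulting classification might be sketched as below; the modulo form and the function names are my assumptions:

```python
def second_distribution_margin(second_distance: float, first_margin: float,
                               unit_width: float) -> float:
    """Second distribution margin of one sub-pixel (sketch of step S410).

    The sub-pixel's second distance from the screen edge, shifted by
    the first distribution margin ΔB, is reduced modulo the
    periodic-unit width ΔX; the remainder is the margin.
    """
    return (second_distance + first_margin) % unit_width


def is_first_sub_pixel(margin: float, unit_width: float) -> bool:
    """Remainder <= half a periodic unit -> first (left-eye) sub-pixel."""
    return margin <= unit_width / 2
```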
  • each sub-pixel belongs to the first sub-pixel or the second sub-pixel according to the following:
  • the green sub-pixel is determined as a first sub-pixel for inputting the left-eye data stream.
  • the first sub-pixel and the second sub-pixel can be accurately determined without physical adjustment of the display module.
  • the viewer can view the 3D content with the naked eye simply by controlling the left-eye data stream to be input to the first sub-pixels and the right-eye data stream to the second sub-pixels.
  • the spatial reality display method further includes:
  • a left-eye data stream and a right-eye data stream are determined.
  • the video signal may be a signal of 3D content, and the 3D content may be generated on a PC (Personal Computer), for example, in 3D software (such as UE/Unity).
  • the video signal may also be processed and generated by, for example, a laptop, a 3D content generation processor, or a cloud server, or may even be generated on some mobile terminals.
  • the present disclosure does not limit the generation method of 3D content.
  • the device that generates the video signal may be a device in the display system, or a device outside the display system that sends the generated video signal to the display system. The resolution of the video signal may be 7680X4320@60Hz, and the video signal may be in 3D-SideBySide (parallel 3D signal) mode or 3D-FrameByFrame (sequential 3D signal) mode. Specifically, it may be output to the display system through the output interface HDMI2.0X4.
  • the output video signal may include a video image with a video resolution of 7680X4320@60Hz.
  • the display system is provided with an HDMI_RX module electrically connected to the output interface, and the HDMI_RX module includes an interface of HDMI 2.0X4 to receive the video signal.
  • the left eye data stream and the right eye data stream can be determined, wherein the left eye data stream is used to control the 3D display module to output a video stream suitable for left eye viewing (e.g., left eye visual image), and the right eye data stream is used to control the 3D display module to output a video stream suitable for right eye viewing (e.g., right eye visual image), so as to form a naked eye 3D visual image.
  • determining the left-eye data stream and the right-eye data stream according to the video signal includes:
  • l represents an image of a single frame in the first video stream
  • r represents an image of a single frame in the second video stream
  • L represents an image of a single frame in the left eye data stream
  • R represents an image of a single frame in the right eye data stream.
  • Whether the video signal is in the first mode or the second mode may be determined by manual input or by analyzing the video signal.
  • the first video stream and the second video stream are asynchronous video streams.
  • if the left-eye video stream and the right-eye video stream in the video signal are asynchronous, the generated naked-eye 3D image is poor.
  • a first space and a second space are opened up in the storage space of the display system, the first space being an odd-frame address space and the second space an even-frame address space; or, vice versa, the first space is an even-frame address space and the second space an odd-frame address space.
  • the size of the first space and the size of the second space are determined based on the physical resolution of the 3D display module.
  • the first video stream is stored in the first space
  • the second video stream is stored in the second space, that is, the odd-numbered frame address space is used to store the left-eye data
  • the even-numbered frame address space is used to store the right-eye data.
  • the left-eye video stream is stored in the odd-numbered frame address space
  • the right-eye video stream is stored in the even-numbered frame address space.
  • the specified period can be a period of 2 frames, or a period of 4 frames. Of course, the specified period can also be set according to actual conditions.
  • the asynchronous first video stream and second video stream can be balanced within the specified period; that is, within the specified period, the first video stream and the second video stream of the same frame are read out to generate a synchronized left-eye data stream and right-eye data stream. Outputting synchronized left-eye and right-eye data streams avoids generating poor naked-eye 3D images.
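The odd/even-frame buffering described above can be sketched as follows. The class and method names are mine; a real implementation would operate on address spaces in display memory sized from the 3D module's physical resolution:

```python
from collections import deque


class FrameSynchronizer:
    """Balance an asynchronous 3D-FrameByFrame signal over a specified period.

    Odd input frames (left eye) go to the first address space and even
    frames (right eye) to the second; a matched pair is read out only
    when both spaces hold a frame, so the left-eye and right-eye data
    streams leave the buffer synchronized.
    """

    def __init__(self) -> None:
        self.odd_space = deque()   # left-eye frames
        self.even_space = deque()  # right-eye frames

    def push(self, frame_index: int, frame) -> None:
        """Store one input frame by the parity of its (1-based) index."""
        target = self.odd_space if frame_index % 2 == 1 else self.even_space
        target.append(frame)

    def pop_pair(self):
        """Return one synchronized (left, right) pair, or None if not ready."""
        if self.odd_space and self.even_space:
            return self.odd_space.popleft(), self.even_space.popleft()
        return None
```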
  • determining the left-eye data stream and the right-eye data stream according to the video signal includes:
  • the mode of the video signal is the second mode
  • the third video stream and the fourth video stream are stretched at a specified ratio to generate a left-eye data stream and a right-eye data stream, wherein the specified ratio is greater than or equal to 2.
  • the synchronized third video stream and the fourth video stream can be obtained by parsing the video signal.
  • the third video stream and the fourth video stream are parallel synchronized video streams.
  • the video signal in 3D-SideBySide (parallel 3D signal) mode has the same timing for both views; its size is subject to the constraints of the input and output interfaces between the device generating the 3D content and the display system, and it is generally an 8K-resolution video.
  • dividing the video signal into a left-eye video stream and a right-eye video stream will result in the generated left-eye data stream and right-eye data stream being data streams with a resolution of less than or equal to 4K.
  • Such a data stream is output to the 3D display module, and the display effect will be poor, and the displayed resolution will be low, resulting in a poor viewing experience of naked-eye 3D.
  • the third video stream and the fourth video stream can be stretched at a specified ratio, and the resolution of the left-eye data stream and the right-eye data stream is effectively improved after horizontal stretching, so that a clearer frame image can be output on the 3D display module, thereby ensuring the display effect of the naked-eye 3D image.
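The split-and-stretch step above can be sketched as follows; nearest-neighbour pixel repetition stands in for whatever horizontal interpolation the hardware actually uses, and the function name is mine:

```python
def split_and_stretch(frame, ratio=2):
    """Split a 3D-SideBySide frame and stretch each half horizontally.

    The left half of each row carries the left-eye image and the right
    half the right-eye image; each half is widened by `ratio` (>= 2)
    using simple pixel repetition, restoring full horizontal resolution.
    """
    if ratio < 2:
        raise ValueError("the specified ratio must be >= 2")
    half = len(frame[0]) // 2
    left = [[px for px in row[:half] for _ in range(ratio)] for row in frame]
    right = [[px for px in row[half:] for _ in range(ratio)] for row in frame]
    return left, right
```

For an 8K side-by-side input, each 3840-pixel half row becomes a 7680-pixel row again after stretching at ratio 2.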
  • the video signal is identified as a video signal in the third mode, and the video signal in the third mode is a 2D video stream.
  • the 2D video stream can be directly output to the 3D display module.
  • the spatial reality display method may further include:
  • a video signal is determined.
  • the generation of the video signal can also be completed outside the display system; that is, 3D content is generated on another 3D content generating device, and the generated 3D content is then sent to the display system.
  • the display system obtains the left eye three-dimensional coordinates and the right eye three-dimensional coordinates from the visual recognition device.
  • a three-dimensional coordinate system is established with the center of the screen (the center of the upper surface of the 3D display module) as the origin, the direction perpendicular to the prism surface as the Z axis, the width direction of the upper surface of the prism as the X axis, and the length direction of the prism as the Y axis.
  • the output 3D content is adjusted according to the left-eye 3D coordinates and the right-eye 3D coordinates, thereby generating the video signal.
  • the device for generating 3D content can also be used as part of the display system.
  • the left eye 3D coordinates and the right eye 3D coordinates are obtained from the visual recognition device, and the 3D content is generated as video information according to the left eye 3D coordinates and the right eye 3D coordinates.
  • the 3D content can be adjusted according to the viewer's viewing angle, and then the first sub-pixel and the second sub-pixel used to display the 3D content are adjusted, so that the viewer can experience a better naked-eye 3D visual effect.
  • the embodiment of the present application provides a spatial reality display system. As shown in FIG. 7, the spatial reality display system 700 of the embodiment of the present application is used to execute the spatial reality display method of the embodiment of the present application.
  • the display system 700 may be a display module for naked-eye 3D, for example, the spatial reality display method is executed by a processor, a processing chip, etc. in the display module, and displayed in the display module.
  • the display system may also include another controller or processor, which controls the display module for naked-eye 3D to output a left-eye data stream through a first sub-pixel group and a right-eye data stream through a second sub-pixel group after executing the spatial reality display method on the display system 700.
  • the spatial reality display system of this embodiment tracks the coordinates of the human eye position, determines in real time the first sub-pixels and the second sub-pixels of the liquid crystal screen in the display module, and outputs the left-eye data stream through the first sub-pixels and the right-eye data stream through the second sub-pixels; that is, the sub-pixels in the display module are redistributed according to the eye coordinates to adapt to the viewer's eye position.
  • the viewer can clearly watch naked-eye 3D from different angles or positions without moving the display screen, avoiding the problem of a poor naked-eye 3D effect caused by the difficulty of accurately locating the human eye position when the display screen is moved, thereby providing a better naked-eye 3D visual effect and enhancing the viewer's viewing experience.
  • the display system 700 includes:
  • the display device 710 includes a display module for 3D display, which includes a plurality of sub-pixels, and
  • the signal output device 720 is electrically connected to the display device 710, and is used to transmit the left-eye video stream and the right-eye video stream generated based on the human eye position to the first sub-pixel group and the second sub-pixel group in the display device 710 respectively.
  • the signal output device 720 is, for example, a PC (Personal Computer), a laptop, a mobile terminal, a 3D content generation processor, or a cloud server.
  • 3D content generated on a PC's 3D software such as 3D content generated on UE/Unity, generates a 3D video signal based on the 3D content.
  • the display device 710 may be integrated with a memory and a processor. Therefore, the spatial reality display method disclosed above may be stored in the memory as program instructions, and the method is implemented by the processor executing those instructions. Specifically, the processor determines, based on the left-eye video stream and the right-eye video stream output by the signal output device 720 and the physical parameters of the 3D display module of the display device 710 stored in the memory, the first sub-pixel group for receiving the left-eye video stream and the second sub-pixel group for receiving the right-eye video stream on the display screen of the display module, so as to realize 3D display.
  • the display system 700 further includes:
  • the human eye recognition device 730 is electrically connected to the display device 710 and the signal output device 720 , and is used to output the human eye position coordinates to the display device 710 and to output the human eye position coordinates to the signal output device 720 .
  • the human eye recognition device 730 can be an integrated visual recognition device, which captures the facial image through its own camera, and determines the three-dimensional coordinates of the left eye and the right eye in the human eye position coordinates through methods such as a trained neural network, and sends them to the display device 710 and the signal output device 720 respectively, so that the display device 710 and the signal output device 720 can execute the above-mentioned spatial reality display method.
  • the human eye recognition device 730 further includes:
  • a collection unit 731 is used to collect human eye information
  • the recognition unit 732 is electrically connected to the acquisition unit 731, the display device 710 and the signal output device 720, and is used to recognize the human eye information and determine the human eye coordinate information, which includes the human eye position coordinate information.
  • the three-dimensional coordinates of the left eye and the three-dimensional coordinates of the right eye are output to the display device 710, and the three-dimensional coordinates of the left eye and the three-dimensional coordinates of the right eye are output to the signal output device 720.
  • the acquisition unit 731 may be a camera, a webcam, etc., and a webcam may be used to acquire a face image.
  • the recognition unit 732 may be an eye recognition device, such as a SOC (System on Chip), a trained neural network, etc., to recognize the face image and determine the eye position coordinates.
  • An integrated visual recognition device may also be used to directly recognize the eye coordinate information, which includes the eye position coordinates, the left eye three-dimensional coordinates, and the right eye three-dimensional coordinates.
  • the display device 710 can obtain the eye position coordinates, and the signal output device 720 can obtain the left eye three-dimensional coordinates and the right eye three-dimensional coordinates, so that the display device 710 and the signal output device 720 can execute the corresponding spatial reality display method.
  • first and second are used for descriptive purposes only and should not be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, a feature defined as “first” or “second” may explicitly or implicitly include one or more of the features. In the description of this application, the meaning of “plurality” is two or more, unless otherwise clearly and specifically defined.
  • the terms “installed”, “connected”, “coupled”, “fixed” and the like should be understood in a broad sense; for example, a connection may be fixed, removable, or integral; mechanical, electrical, or communicative; direct, or indirect through an intermediate medium; it may be an internal communication between two elements or an interaction relationship between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in this application can be understood according to the specific circumstances.
  • a first feature being “above” or “below” a second feature may include that the first and second features are in direct contact, or may include that the first and second features are not in direct contact but are in contact through another feature between them.
  • a first feature being “on”, “above” or “over” a second feature includes the first feature being directly above or obliquely above the second feature, or simply indicates that the first feature is at a higher level than the second feature.
  • a first feature being “under”, “below” or “beneath” a second feature includes the first feature being directly below or obliquely below the second feature, or simply indicates that the first feature is at a lower level than the second feature.
  • the logic and/or steps represented in the flowchart or otherwise described herein, for example, can be considered as an ordered list of executable instructions for implementing logical functions, which can be embodied in any computer-readable medium for use by an instruction execution system, apparatus or device (such as a computer-based system, a system including a processor or other system that can fetch instructions from an instruction execution system, apparatus or device and execute instructions), or used in combination with these instruction execution systems, apparatuses or devices.
  • each functional unit in each embodiment of the present application can be integrated into a processing module, or each unit can exist physically separately, or two or more units can be integrated into one module.
  • the above-mentioned integrated module can be implemented in the form of hardware or in the form of a software functional module. If the above-mentioned integrated module is implemented in the form of a software functional module and sold or used as an independent product, it can also be stored in a computer-readable storage medium.
  • the storage medium can be a read-only memory, a disk or an optical disk, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

本公开实施例提供一种空间现实显示方法及空间现实显示系统,该空间现实显示方法包括:基于显示模组的参数建立坐标系,通过人眼位置和所述显示模组的参数在所述坐标系中的坐标,实时确定用于实现3D显示的左眼视图和右眼视图对应的周期单元的属性参数。通过显示模组的参数以及左眼视图和右眼视图对应的周期单元的属性参数,能够确定第一子像素组和第二子像素组,通过第一子像素组和第二子像素组在显示模组上分别输出左眼数据流和右眼数据流,即实现了根据人眼位置对显示模组内的各子像素进行了重新分配,以适配观看者的人眼位置。

Description

空间现实显示方法、空间现实显示系统以及非易失性计算机可读存储介质
相关申请的交叉引用
本申请要求于2022年10月31日提交的中国专利申请No.202211351069.X的优先权,其内容在此通过引用方式整体并入本申请。
技术领域
本申请涉及空间现实显示系统的技术领域,尤其涉及一种空间现实显示方法及空间现实显示系统、非易失性计算机可读存储介质。
背景技术
裸眼3D显示是对不借助偏振光眼镜等外部工具,实现立体视觉效果的技术的统称。相关3D显示技术通过移动显示屏的位置来适配观看者的观看角度,从而实现较好的裸眼3D显示效果。但是通过移动显示屏的位置适配观看者的方式中,移动显示屏往往难以定位到人眼合适的位置,导致裸眼3D显示效果较差。
发明内容
本申请实施例提供一种空间现实显示方法及空间现实显示系统、非易失性计算机可读存储介质,以解决或缓解现有技术中的一项或更多项技术问题。
作为本申请实施例的一个方面,本申请实施例提供一种空间现实显示方法,包括:
获取人眼位置、包括多个子像素的显示模组的参数和用于实现3D显示的左眼数据流、右眼数据流;
根据显示模组的参数,建立坐标系;
根据人眼位置和显示模组的参数在坐标系中的坐标,确定左眼视图和右眼视图对应的周期单元的属性参数,周期单元的属性参数包括周期单元的宽度和周期单元的边界的横坐标,其中,周期单元的宽度为左眼视图和右眼视图中相邻左眼视图和右眼视图的宽度之和,周期单元的边界的横坐标为两个相邻周期单元之间边界处的横坐标;
根据显示模组的参数、左眼视图和右眼视图的周期单元的宽度和周期单元的边界的横坐标,将所述多个子像素划分为第一子像素组和第二子像素组,其中,第一子像素组为显示模组中输出左眼数据流的子像素,第二子像素组为在显示模组中输出右眼数据流的子像素。
作为本申请实施例的另一个方面,本申请实施例还提供一种空间现实显示系统,用于执行本申请的所有实施例的空间现实显示方法。
作为本申请实施例的另一个方面,本申请实施例还提供一种非易失性计算机可读存储介质,其上存储的程序被处理器执行时能够执行本申请的所有实施例的空间现实显示方法。
本申请实施例采用上述技术方案可以得到如下有益效果:
在本实施例中,基于显示模组的参数确定坐标系,通过人眼位置坐标、显示模组的参数和坐标系,实时确定左眼视图和右眼视图对应的周期单元的属性参数。基于左眼视图和右眼视图对应的周期单元的属性参数,即确定出显示模组的用于实现3D显示的左眼视图和右眼视图。在确定了左眼视图和右眼视图的情况下,通过显示模组的参数以及左眼视图和右眼视图对应的周期单元的属性参数,能够确定第一子像素组和第二子像素组,通过第一子像素组和第二子像素组在显示模组上输出左眼数据流和右眼数据流,即实现了根据人眼的坐标位置对显示模组内的像素进行了重新分布,以适配观看者的人眼位置。使得观看者在不同角度或不同位置的情况下,都能够清晰地观看到裸眼3D,而且无需对显示屏进行移动,避免了因为移动显示屏造成难以准确定位人眼位置,导致裸眼3D显示效果较差的问题,以便能够带来较高的裸眼3D视觉效果,提升观看者的观看体验。
上述概述仅仅是为了说明书的目的,并不意图以任何方式进行限制。除上述描述的示意性的方面、实施方式和特征之外,通过参考附图和以下的详细描述,本申请进一步的方面、实施方式和特征将会是容易明白的。
附图说明
在附图中,除非另外规定,否则贯穿多个附图相同的附图标记表示相同或相似的部件或元素。这些附图不一定是按照比例绘制的。应该理解,这些附图仅描绘了根据本申请公开的一些实施方式,而不应将其视为是对本申请范围的限制。
图1示出根据本申请实施例的空间现实显示方法的流程图。
图2示出在坐标系中实现根据本申请实施例的空间现实显示方法的示意图。
图3示出根据本申请实施例的空间现实显示方法的流程图。
图4示出根据本申请实施例的空间现实显示方法的流程图。
图5示出根据本申请实施例的空间现实显示方法的流程图。
图6为图5中的实施例的空间现实显示方法的时序示意图。
图7示出根据本申请实施例的空间现实显示系统的结构示意图。
附图标记说明:
700、显示系统;710、显示装置;720、信号输出装置;730、人眼识别装
置;731、采集装置;732、识别装置。
具体实施方式
在下文中,仅简单地描述了某些示例性实施例。正如本领域技术人员可认识到的那样,在不脱离本申请的精神或范围的情况下,可通过各种不同方式修改所描述的实施例。因此,附图和描述被认为本质上是示例性的而非限制性的。
图1示出根据本申请实施例的空间现实显示方法的流程图,如图1所示,该空间现实显示方法包括:
S110:获取人眼位置、显示模组的参数和左眼数据流、右眼数据流。
S120:根据显示模组的参数,建立坐标系。
S130:根据人眼位置、显示模组的参数在坐标系中的坐标,确定左眼视图和右眼视图对应的周期单元的属性参数,周期单元的属性参数包括周期单元的宽度和周期单元的边界的横坐标,其中,周期单元的宽度为左眼视图和右眼视图中相邻左眼视图和右眼视图的宽度之和。
S140:根据显示模组的参数、左眼视图和右眼视图的周期单元的宽度和周期单元的边界的横坐标,确定第一子像素组和第二子像素组,其中,第一子像素组中的第一子像素为显示模组中输出左眼数据流的子像素,第二子像素组中的第二子像素为在显示模组中输出右眼数据流的子像素。
本实施例的空间现实显示方法可以适用于显示系统上,具体可以为直接在裸眼3D的显示模组上执行,例如通过显示模组内的处理器、处理芯片等执行该空间现实显示方法,并在显示模组上显示。也可以通过另外的控制器或处理器执行空间现实显示方法后,控制裸眼3D的显示模组在第一子像素上输出左眼数据流和在第二子像素上输出右眼数据流。显示模组可以为显示屏,显示器等,用于供裸眼3D显示的设备。下述实施例中以显示系统为执行主体进行举例说明。
在本实施例中,基于显示模组的参数建立坐标系,通过人眼位置、显示模组的各物理参数在坐标系上的坐标,实时地(即,针对视频中的每一帧图像)确定左眼视图和右眼视图对应的周期单元的属性参数。基于左眼视图和右眼视图对应的周期单元的属性参数,即可以确定出显示模组的用于实现3D显示的左眼视图和右眼视图。在确定了左眼视图和右眼视图的情况下,通过显示模组的参数以及左眼视图和右眼视图对应的周期单元的属性参数,能够确定第一子像素组和第二子像素组,通过第一子像素组和第二子像素组在显示模组上分别输出左眼数据流和右眼数据流,即实现了根据人眼位置对显示模组内的子像素进行了重新分配,以适配观看者的人眼位置。使得观看者在不同角度或不同位置的情况下,都能够清晰地观看到裸眼3D显示,而无需移动显示屏,避免了因为移动显示屏造成难以准确地定位人眼位置而导致的裸眼3D显示效果较差的问题,以便带来较高的裸眼3D视觉效果,提升观看者的观看体验。
在步骤S110中,获取人眼位置、显示模组的参数和左眼数据流、右眼数据流。
左眼数据流和右眼数据流通常根据视频信号确定的,例如可以是在显示装置接收到视频流后解析出的左眼数据流和右眼数据流。视频信号为搭载视频播放内容的信号。对于裸眼3D而言,其关键点是利用双目视差,通过向左眼和右眼投影不同的视频流的图像,从而能够使得人眼产生3D的视觉效果。即需要将视频信号进行解析操作,将视频信号中的数据流划分为左眼数据流和右眼数据流,其中,左眼数据流用于控制3D显示模组输出适于左眼观看的视频流(例如左眼视觉图像),右眼数据流用于控制3D模组输出适于右眼观看的视频流(例如右眼视觉图像),从而在3D显示模组上形成裸眼的3D视觉影像。
通常存在不同模式的视频信号,根据不同模式的视频信号需要进行不同处理。例如,对于时序不同步的3D-FrameByFrame模式的视频信号而言,通过调整视频信号中的左眼视频流和右眼视频流,使得生成的左眼数据流和右眼数据流能够同步,避免由于视频信号中左眼视频流和右眼视频流生成的左眼数据流和右眼数据流不同步导致裸眼3D视觉效果较差的问题。
又例如,对于时序同步的3D-SideBySide(并列型3D信号)模式的视频信号而言,视频信号的大小受到生成3D内容的设备与显示系统之间的输入接口 和输出接口的制约,一般为8K分辨率的视频。此时将视频信号分割为左眼视频流和右眼视频流,会导致生成的左眼数据流和右眼数据流均为小于等于4K分辨率的数据流,这样的数据流输出到3D显示模组中,显示的效果较差,显示的分辨率较低,造成裸眼3D的观看体验较差。在处理视频信号时,在将视频信号分割为左眼视频流和右眼视频流后,分别对左眼视频流和右眼视频流进行水平拉伸处理,左眼数据流和右眼数据流经过水平拉伸后,其分辨率得到有效的提升,从而能够在3D显示模组上输出较为清晰的帧图像,进而能够确保裸眼3D图像的显示效果。
在一些实施例中,视频信号可以为3D内容的信号,该3D内容可以在PC(Personal Computer,个人计算机)上生成,例如可以在3D软件(如UE/Unity)上生成。在一些实施例中,视频信号例如也可以由笔记本电脑、3D内容生成处理器或者云端服务器生成,甚至可以在一些移动终端上生成。本公开对于3D内容的生成方式不做限定。生成视频信号的设备可以为显示系统上设备;也可以是显示系统外设备,其通过将生成的视频信号发送给显示系统,使得视频信号的分辨率为7680X4320@60Hz,视频信号可以为3D-SideBySide(并列型3D信号)模式或3D-FrameByFrame(时序型3D信号)模式。具体可以通过输出接口HDMI 2.0X4输出到显示系统。输出的视频信号中可以包括视频分辨率7680X4320@60Hz的视频图像。相对应地,显示系统上设有与输出接口相对应的电连接的HDMI_RX模块,HDMI_RX模块包括HDMI 2.0X4接口来接收视频信号。
根据视频信号,能够确定左眼数据流和右眼数据流,其中,左眼数据流用于控制3D显示模组输出适于左眼观看的视频流(例如左眼视觉图像),右眼数据流用于控制3D显示模组输出适于右眼观看的视频流(例如右眼视觉图像),以便形成裸眼的3D视觉影像。
可以通过视觉识别装置识别人眼位置信息,以基于人眼位置信息确定出人眼位置。人眼位置在所建立的坐标系中的坐标可以包括左眼的位置坐标和右眼 的位置坐标,还可以包括左眼和右眼中间的位置坐标,该位置坐标可以是三维坐标,也可以是二维坐标。在一些实施例中,可以采用摄像头采集到人脸图像,通过眼部识别装置,例如SOC(System on Chip,系统级芯片)、训练好的神经网络等对人脸图像进行识别,确定出人眼位置。也可以采用一体式的视觉识别装置,直接识别到人眼位置信息,基于人眼位置信息,确定人眼位置。该视觉识别装置可以作为显示系统的一部分,也可以是与显示系统外部电连接的装置,将生成的左眼位置坐标和右眼位置坐标、左右眼中间的位置坐标等人眼位置坐标发送给显示系统,使得显示系统能够获取到人眼位置坐标。
显示模组常可以为3D模组,本实施例以及其他实施例中的3D模组为显示模组。显示模组包括显示面板,能够输出产生裸眼3D显示效果的图像,使得观看者能够观看到3D图像。
裸眼3D显示模组通常包括透镜3D显示模组和狭缝光栅3D显示模组。基于狭缝光栅的裸眼3D显示设备由2D液晶显示器与狭缝光栅两部分组成。通过在2D显示器上加载多个视点的图像编码信息,可以让不同的视差图像在空间中不同位置处成像,从而实现裸眼3D的显示效果;基于柱透镜光栅的裸眼3D显示设备通常由2D液晶显示器与柱透镜光栅两部分组成;其显示原理与狭缝光栅立体显示器类似,都是通过在2D显示面板上编码不同角度的视差图像实现立体。柱透镜光栅通常是由许多结构相同的柱面透镜平行排列而成。由于柱面透镜通常采用透明介质材料制作,因此在调制编码2D图像时对光线不会有遮挡作用。相比于狭缝光栅,基于柱透镜光栅的裸眼3D显示具有亮度高的优点。以下,本申请将以透镜3D显示模组为例对本申请的空间现实显示方法及空间现实显示系统进行描述。但是,如上所述,鉴于透镜3D显示模组和狭缝光栅3D显示模组的成像原理基本类似,因此,本申请以下记载的空间现实显示方法及空间现实显示系统也同样适用于狭缝光栅3D显示模组。
显示模组的长宽高、显示模组的表面中心、显示模组中液晶屏的长宽高、显示模组中的棱镜尺寸以及棱镜排布、显示模组的长度方向、显示模组的宽度 方向、显示模组的表面垂直方向以及显示模组中液晶屏内的像素排布等等,都属于显示模组的物理属性参数,对于制成的显示模组而言是确定的,通常不会因为常规移动或者对其进行常规操作而发生改变。对于显示模组的参数,通常在出厂的时候会存储于显示模组的存储器中,或者是记录在相应的位置上。本实施例的显示装置可以通过读取显示模组的存储器或者是通过人为输入的方式获取到显示模组的参数,也可以通过读取确定显示模组的型号,在数据库或者是互联网等具有该显示模组型号记录的空间内获取到显示模组的参数。
本实施例中,显示模组的参数至少包括显示模组上表面(即,朝向用户的表面)的中心、显示模组上表面的垂直方向、显示模组的宽度方向、显示模组中棱镜的第一宽度、显示模组中棱镜表面至液晶屏的第一距离、显示模组中液晶屏的第二宽度;显示模组的参数还可以包括显示模组中液晶屏的各子像素的中点与液晶屏的边缘的第二距离等。其中,显示模组中棱镜表面至液晶屏的第一距离可以等于棱镜的厚度,也可以不等于棱镜厚度,这取决于显示模组的设计,也属于显示模组的物理属性。上述显示模组的参数在出厂时均为已知的物理属性。
在步骤S120中,根据显示模组的参数,建立坐标系。
如上述针对显示模组的描述,可以知道,显示模组的参数包括显示模组上表面的中心、显示模组上表面的垂直方向以及显示模组的宽度方向,当然也可以包括其他参数,例如显示模组的长度方向、显示模组中液晶屏表面的中心等。
本实施例中以显示模组上表面的中心为原点、显示模组上表面的垂直方向为Z坐标轴方向以及显示模组的宽度方向为X坐标轴方向,建立二维坐标系。在可选实施例中,也可以以显示模组上表面的中心为原点、显示模组上表面的垂直方向为Z坐标轴方向、显示模组的宽度方向为X坐标轴方向和显示模组的长度方向为Y坐标轴方向建立三维坐标系。在可选实施例中,还可以以显示模组中液晶屏表面的中心为原点建立二维或三维坐标系。在本公开中,需要确定针对左眼数据流和右眼数据流的子像素排布,还需要确定对应的左眼和右眼位 置坐标。对此,该坐标系至少应当包括与显示模组的宽度方向、显示模组的垂直方向相关联的坐标系,以便能够将人眼位置坐标定位在该二维或者三维坐标系中,从而能够确定针对左眼数据流和右眼数据流的第一子像素组和第二子像素组的排布。坐标系的确立可以根据需要进行调整,在此不做限定。
在步骤S130中,根据人眼位置、显示模组的参数在坐标系中的坐标,确定左眼视图和右眼视图对应的周期单元的属性参数,周期单元的属性参数包括周期单元的宽度和周期单元的边界的横坐标,其中,周期单元的宽度为左眼视图和右眼视图中相邻左眼视图和右眼视图的宽度之和。
如上所述,本公开以显示模组包括多组棱镜以及液晶屏作为示例进行描述,其中,液晶屏相对于棱镜远离观看者,通常观看者通过棱镜获得液晶屏的显示画面。
根据上述实施例,可以知道,显示模组的参数包括显示模组中棱镜的第一宽度和显示模组中棱镜表面至液晶屏的第一距离等相关的物理属性。本实施例中的棱镜的第一宽度为单个棱镜的宽度。如图2所示,棱镜的第一宽度P属于显示模组在设计生产制造时已经确定的属性,可以直接获取。类似地,棱镜表面至液晶屏的第一距离H也是可以根据显示模组的设计参数直接得到,其中,第一距离H具体为棱镜远离液晶屏一侧的表面与液晶屏表面之间的最小距离,即第一距离H也可以直接根据显示模组的物理属性得到。
本实施例中,周期单元为用于实现3D显示的一帧图像所划分的分布在显示模组的液晶屏上多个左眼视图和右眼视图中相邻左眼视图和右眼视图构成的最小单元。沿着液晶屏的宽度方向,左眼视图和右眼视图交替排布且多个周期单元依次排布,构成显示模组的液晶屏的左眼视图和右眼视图排布,以用于产生裸眼3D显示效果。即,左眼视图和右眼视图是为了实现3D显示效果而通过液晶屏上的第一子像素组和第二子像素组对应输出左眼数据流和右眼数据流中一帧显示画面所划分的左影像和右影像。也就是说,周期单元为左眼视图和右眼视图中相邻左眼视图和右眼视图构成的最小单元,周期单元至少对应显示屏 上的一个第一子像素和一个第二子像素。
从观看者视觉的角度而言,其观看到的内容实际为左眼视图和右眼视图。对于左眼视图和右眼视图而言,确定左眼视图和右眼视图的排布是通过周期单元以及周期单元的边界来确定的,其排布与显示模组中液晶屏的宽度有关,也可以说跟宽度方向上的子像素排布有关。可以直接以周期单元的边界的横坐标为起点,以周期单元的宽度对液晶屏进行宽度方向上的分割,从而能够得到左眼视图和右眼视图在显示屏上的排布。其中,周期单元的边界可以为液晶屏边缘,也可以是两个周期单元之间的交界处的边界。
根据人眼位置、显示模组的参数在坐标系中的坐标,确定左眼视图和右眼视图对应的周期单元的属性参数,即确定出周期单元的宽度和周期单元的边界的横坐标。
图2示出在坐标系中实现根据本申请实施例的空间现实显示方法的示意图。如图2所示,为了便于周期单元的宽度和周期单元的边界的横坐标的确定,本实施例中,以屏幕正中心(3D模组上表面的正中心)为原点,垂直于棱镜表面方向为Z轴,以棱镜上表面的宽度方向为X轴,以棱镜的长度方向为Y轴,建立三维坐标系。在该三维坐标系中确定人眼位置的坐标。
左眼视图和右眼视图的适应性排布使得观看者在当前的视角下,左眼能够看到左眼数据流输出的左眼视图(图3中左眼视图和右眼视图的黑色部分为左眼视图),右眼能够看到右眼数据流输出的右眼视图(左眼视图和右眼视图的白色部分为右眼视图)。可以理解的是,左眼视图和右眼视图为实现裸眼3D显示效果而相对于显示屏对每帧图像进行划分产生的。在观察者的视角发生改变的情况下,左眼视图和右眼视图在显示屏上的排布也会发生改变,从而能够确保观察者无论在哪个角度都能够实现良好的裸眼3D显示效果。左眼视图和右眼视图在本申请中作为显示效果图。左眼视图和右眼视图是通过第一子像素和第二子像素构建的,其中左眼视图是第一子像素在输入左眼数据流的情况下构建的,右眼视图是第二子像素在输入右眼数据流的情况下构建的。基于左眼视图和右眼视图,可以确定第一子像素组和第二子像素组。通过人眼位置的左眼位置坐标ML(XL,ZL)和右眼位置坐标MR(XR,ZR),根据如下公式(1)和公式(2),计算出左眼和右眼连线的中点的坐标M1(X1,Z1);人眼位置坐标中已经包括了左眼和右眼连线的中点的坐标M1(X1,Z1)的方式也是可行的。
X1 = (XL + XR)/2      (1)
Z1 = (ZL + ZR)/2      (2)
根据第一宽度、第一距离和人眼位置坐标,确定左眼视图和右眼视图的周期单元的宽度具体为:
如图2所示,在基于棱镜的3D显示模组中,为了获得裸眼3D显示效果,棱镜的宽度对应于液晶屏上显示的左眼视图和右眼视图的周期单元的宽度。如图2所示,一个周期单元由相邻的最小单元的左眼视图和最小单元的右眼视图构成。
如图2所示,具体地,周期单元的宽度可以通过如下方式获得:将左眼和右眼连线的中点M1分别与一个棱镜的两端连接形成两条连线,并将两条连线延长至液晶屏的表面,形成与液晶屏的两个交点M2和M3,液晶屏显示的画面中这两个交点M2和M3之间的部分即为左眼视图和右眼视图的一个周期单元,即左眼视图和右眼视图分布的最小单元。一帧显示画面的多个左眼视图和右眼视图是以周期单元为单位在显示屏的宽度方向上重复排布形成的,也就是说,根据左眼视图和右眼视图在显示屏上的排布将显示屏上的子像素划分为对应第一子像素组和第二子像素组,向第一子像素组内的各第一子像素输入左眼数据流,向第二子像素组内的各第二子像素输入右眼数据流,就可以构成该左眼视图和右眼视图。因此,通过确定两个交点M2和M3之间的距离,即可以确定左眼视图和右眼视图的一个周期单元的宽度。根据显示模组的物理特性,棱镜的上表面与液晶屏的上表面一般是保持平行的,当然有可能存在可以忽略的误 差。具体地,通过左眼和右眼连线的中点分别与一个棱镜的两端连线构成了一个三角形,而左眼和右眼连线的中点与交点M2、交点M3构成了另一个三角形,由于棱镜的上表面与液晶屏的上表面一般是保持平行的,即两个三角形的底边平行,因此,两个三角形构成了相似三角形。
基于相似三角形,连同棱镜的第一宽度P、左眼和右眼连线的中点的坐标M1(X1,Z1)以及棱镜表面至液晶屏的第一距离H,通过如下公式(3)和由公式(3)转换的公式(4),可以计算获取左眼视图和右眼视图的一个周期单元的宽度△X;
△X/P = (Z1 + H)/Z1      (3)
△X = P × (Z1 + H)/Z1      (4)
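上述公式(1)至(4)的计算可用如下代码示意(仅为依据本文相似三角形关系的示意性实现,函数名与变量名均为说明用的假设命名):

```python
def midpoint(XL, ZL, XR, ZR):
    """公式(1)(2):计算左眼和右眼连线的中点坐标 M1(X1, Z1)。"""
    return (XL + XR) / 2.0, (ZL + ZR) / 2.0

def period_unit_width(P, H, Z1):
    """公式(3)(4):由相似三角形得到周期单元的宽度 ΔX = P*(Z1+H)/Z1。
    其中 P 为棱镜的第一宽度,H 为棱镜表面至液晶屏的第一距离,
    Z1 为人眼(双眼中点)到棱镜表面的垂直距离。"""
    return P * (Z1 + H) / Z1
```

例如,取 P=1、H=1、Z1=3 时,ΔX = 4/3,即液晶屏上的周期单元略宽于单个棱镜,且观看距离越远,ΔX 越接近 P。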
通过棱镜的第一宽度P、左眼和右眼连线的中点的坐标M1(X1,Z1)以及棱镜表面至液晶屏的第一距离H以及左眼视图和右眼视图的周期单元的宽度△X,可以通过计算确定出周期单元的边界的横坐标,即交点M2的横坐标。在坐标系中,M2的横坐标即为M2的X轴坐标。
在本实施例中,为了方便对交点M2的横坐标的计算,将棱镜的一个端部设定为显示屏的表面的中心位置,即三维坐标系的原点位置,此时该端部的坐标为(0,0)。交点M2的Z轴坐标Z2为H,此时交点M2的坐标(M2的横坐标和M2的Z轴坐标)以及坐标系原点构成了一个三角形。而左眼和右眼连线的中点M1的X轴坐标、中点M1的Z轴坐标与坐标系原点构成了另一个三角形,且由于棱镜的上表面与液晶屏的上表面一般是保持平行的,即两个三角形的底边平行,因此,两个三角形构成了相似三角形。
基于相似三角形,由左眼和右眼连线的中点坐标M1(X1,Z1)以及棱镜表面至液晶屏的第一距离H(即M2的Z轴坐标Z2为H),通过如下公式(5)和由公式(5)转换的公式(6),可以计算确定出周期单元的边界的横坐标X2:
X2/H = -X1/Z1      (5)
X2 = -X1 × H/Z1      (6)
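周期单元边界横坐标的计算可示意如下(假设按上文过原点的相似三角形关系取 X2 = −X1·H/Z1,其中负号对应边界向人眼相反一侧偏移,该符号约定为本示例的假设):

```python
def boundary_abscissa(X1, Z1, H):
    """公式(5)(6):由过原点(棱镜端部)的对顶相似三角形,
    求周期单元边界在液晶屏上的横坐标 X2 = -X1 * H / Z1。"""
    return -X1 * H / Z1
```

例如,人眼中点在 X1=1、Z1=2 处且 H=1 时,X2=-0.5;人眼位于屏幕正前方(X1=0)时,边界恰好在原点。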
在确定了周期单元的宽度和周期单元的边界的横坐标情况下,根据液晶屏的宽度,能够确定出一帧图像的全部左眼视图和右眼视图的具体排布。
S140:根据显示模组的参数、左眼视图和右眼视图的周期单元的宽度和周期单元的边界的横坐标,确定第一子像素组和第二子像素组,其中,第一子像素组中的第一子像素为显示模组中输出左眼数据流的子像素,第二子像素组中的第二子像素为在显示模组中输出右眼数据流的子像素。
由于左眼视图和右眼视图中的左眼视图是第一子像素组输出左眼数据流的情况下生成的,右眼视图是第二子像素组输出右眼数据流的情况下生成的,周期单元的边界两侧分别对应一个第一子像素和一个第二子像素。根据液晶屏的宽度和周期单元的宽度可以将沿着液晶屏宽度方向排布的每个周期单元的边界的横坐标确定出来。由于液晶屏的每个子像素的位置是确定的,通过所确定的每个周期单元的边界的横坐标,用每个子像素的中心位置坐标除以周期单元的宽度来确定出该子像素对应的是哪个周期单元,同时能够确定余数。进一步地,通过判断余数,可以确定该子像素为第一子像素还是第二子像素。具体地,如果该子像素余数大于二分之一周期单元的宽度,则确定为第二子像素;若该子像素余数小于等于二分之一周期单元的宽度,则确定为第一子像素。基于此,能够准确地确定出第一子像素和第二子像素,无需进行物理调整。即,只需要通过控制左眼数据流输入到第一子像素组,右眼数据流输入到第二子像素组,观看者即可以实现良好的裸眼3D观看效果。
其中,本实施例中确定的第一子像素组为各第一子像素的集合,第二子像素组也同样为各第二子像素的集合。
在本实施例中,可以确定出第一子像素组和第二子像素组,基于第一子像素组和第二子像素组输出对应的左眼数据流和右眼数据流,使得观看者可以清 楚地观看到裸眼3D显示效果。而且,在本公开中,只需要采集到人眼位置,进一步确定人眼位置在坐标系中的坐标,其可以包括左眼三维坐标和右眼三维坐标。由于基于人眼位置的改变而确定第一子像素和第二子像素的重排的过程中只涉及Z轴以及X轴的坐标,因此无需获取人眼位置的Y轴坐标值,只需要获取人眼位置的X轴和Z轴的坐标值即可。即,在本公开中,仅仅需要获得人眼位置坐标中左眼的X轴和Z轴的坐标值、右眼的X轴和Z轴的坐标值。另外,在计算过程中采用的是左眼和右眼连线的中点的坐标,即人眼位置坐标可以仅仅指的是左眼和右眼连线的中点的三维坐标,或者左眼和右眼连线的中点的X轴坐标和Z轴坐标。
通过人眼位置坐标,确定出在显示模组中输出左眼数据流的第一子像素组,以及确定出在显示模组中输出右眼数据流的第二子像素组。通过控制左眼数据流在第一子像素组内输出,控制右眼数据流在第二子像素组内输出,从而能够生成相对左眼的第一图像和相对右眼的第二图像,进而能够根据人眼位置带来较高的裸眼3D视觉效果,提升观看者的观看体验。
在观看者调换视角或者位置的时候,重新确定人眼位置坐标,再根据人眼位置坐标来重新确定第一子像素组和第二子像素组,控制左眼数据流在第一子像素组内输出,控制右眼数据流在第二子像素组内输出,重新生成相对左眼的第一图像和相对右眼的第二图像。实现了在观看者调换位置或者视角的情况下,对3D显示模组内的子像素进行逻辑重排,重新适应观看者的人眼位置,使得观看者可以清楚地观看到3D画面,无需额外进行显示系统的物理调整,避免了物理调整造成的误差,同时也能够提升其显示效果,更加有效地提升了用户体验。
在一些实施例中,显示模组的参数包括显示模组上表面的中心、显示模组上表面的垂直方向以及显示模组的宽度方向;根据显示模组的参数,确定坐标系包括:
以显示模组上表面的中心为原点、以显示模组上表面的垂直方向为第一坐标轴方向以及以显示模组的宽度方向为第二坐标轴方向,确定坐标系。
根据上述实施例可以知道,显示模组的参数可以包括显示模组上表面的中心、显示模组上表面的垂直方向以及显示模组的宽度方向等显示模组的物理属性,也还可以包括显示模组的长度方向等的参数。
在本实施例中,由于确定第一子像素和第二子像素的像素重排的过程只涉及Z轴以及X轴的坐标,因此无需获取人眼位置坐标中的Y轴坐标值,只需要获取人眼位置坐标的X轴和Z轴的坐标值,即人眼位置坐标左眼的X轴和Z轴的坐标值、右眼的X轴和Z轴的坐标值。
以显示模组上表面的中心(3D模组上表面的正中心)为原点,以显示模组上表面的垂直方向为Z轴方向,以显示模组宽度方向为X轴方向,建立二维坐标系,将人眼位置坐标转换到该二维坐标系中。从而能够方便地计算周期单元的属性参数中的周期单元的宽度和周期单元的边界的横坐标,降低运算量,提升对3D内容处理的效率。
当然还可以在上述二维坐标系的基础上,以显示模组的长度方向为Y轴方向,建立三维坐标系。在此不再赘述。
在一些实施例中,显示模组的参数包括显示模组中棱镜的第一宽度和显示模组中棱镜表面至液晶屏的第一距离;根据人眼位置坐标、显示模组的参数和坐标系,确定左眼视图和右眼视图的周期单元的宽度和周期单元的边界的横坐标包括:
根据第一宽度、第一距离、人眼位置坐标以及坐标系,确定周期单元的宽度和周期单元的边界的横坐标。
其中,本实施例中的棱镜的第一宽度为单个棱镜的宽度,棱镜的第一宽度P可以直接根据棱镜的设计确定,即属于显示模组在设计生产制造时已经确定的属性,可以直接读取得到。类似地,棱镜表面至液晶屏的第一距离H也可以直接根据显示模组的设计得到,其中,第一距离H具体为棱镜远离液晶屏一侧的表面与液晶屏表面之间的最小距离,即第一距离H也可以直接根据显示模组 特性得到。
本实施例中,通过上述实施例的方式建立坐标系,坐标系是二维或者是三维的坐标系。将人眼位置坐标转换到该坐标系中。根据人眼位置坐标、第一距离和第一宽度,通过相似三角形计算出在显示模组上输出左眼数据流的左眼视图和输出右眼数据流的右眼视图的周期单元。在左眼视图和右眼视图中,任一个左眼视图对应于至少一个输出左眼数据流的第一子像素,任一个右眼视图对应于至少一个输出右眼数据流的第二子像素。基于第一子像素和第二子像素输出对应的左眼数据流和右眼数据流,使得观看者可以清楚地观看到裸眼3D的效果。
本实施例中,显示模组中棱镜的第一宽度和显示模组中棱镜表面至液晶屏的第一距离属于显示模组的物理属性,能够容易获取到。通过本申请实施例,能够方便快速地计算确定周期单元的宽度和周期单元的边界的横坐标,从而能够容易地获得用于显示左眼视图的第一子像素组和用于显示右眼视图的第二子像素组,无需引入其他的变量,减少了系统误差的情况。
如图3所示,在一些实施例中,显示模组的参数包括显示模组中液晶屏的第二宽度,根据显示模组的参数、左眼视图和右眼视图的周期单元的宽度和周期单元的边界的横坐标,确定第一子像素组和第二子像素组包括:
S310:获取显示模组中液晶屏的第二宽度,根据周期单元的宽度、周期单元的边界的横坐标以及第二宽度,确定液晶屏的第一分布余量;第一分布余量可以视为液晶屏上与整数倍个周期单元对应后剩余的部分,该剩余部分在X轴上的宽度小于一个周期单元的宽度,而去除剩余部分后,液晶屏其他部分的宽度为周期单元的宽度的整数倍;
S320:根据第一分布余量和周期单元的宽度,确定第一子像素组和第二子像素组。
显示模组的参数包括显示模组中液晶屏的第二宽度,其中,显示模组中液晶屏的第二宽度为液晶屏的整体宽度,即液晶屏水平方向的整体宽度,例如沿着图2所示的X轴方向的宽度。根据第一分布余量和周期单元的宽度,确定第一子像素组和第二子像素组。在观看者处于与3D显示模组上表面的正中心处的垂直方向成一定角度的情况下,由于显示模组的结构是固定的,即液晶屏的像素排布是确定的,因此,在确定了周期单元的宽度和周期单元的边界的横坐标后,通过周期单元的宽度计算周期单元的边界的横坐标时,可能会存在超出液晶屏的边缘的情况,即液晶屏的宽度不能够被均分为若干个周期单元,被若干个周期单元均匀排布,因此液晶屏的边缘部分存在没有左眼视图和右眼视图的余量,即第一分布余量。此时需要考虑对于第一子像素和第二子像素的影响。如果通过周期单元的宽度与周期单元的边界的横坐标确定液晶屏完全被周期单元排布,也就是说,整个液晶屏的所有子像素都能够生成对应的左眼视图和右眼视图,则无需考虑余量问题。
液晶屏的第二宽度W是属于液晶屏的整体宽度,是属于显示模组的物理属性,可以直接获取到。
根据周期单元的宽度、周期单元的边界的横坐标以及第二宽度确定液晶屏的第一分布余量。由于左眼视图和右眼视图的周期单元的排布是以计算确定的周期单元的边界的横坐标为起点,根据周期单元的宽度向两侧依次排布的。在该实施例中,需要考虑周期单元的边界的横坐标对第一分布余量△B的影响。通常而言,液晶屏的边缘部分存在左眼视图和右眼视图的余量,即液晶屏的边缘中存在一部分,该部分在上述依次排布周期单元的宽度后无法覆盖一个周期单元的宽度,例如可能覆盖半个周期单元的宽度,或者四分之一个周期单元的宽度,这部分余量在本申请中称为第一分布余量。在一个实施例中,对于确定第一子像素和第二子像素而言,只需要考虑液晶屏一侧的边缘存在第一分布余量即可,这样液晶屏上尽可能多的子像素参与显示。因此,本公开中以在液晶屏一侧的边缘存在第一分布余量△B为例来讨论如何计算第一分布余量。其结果可以根据公式(7)得到:
△B=MOD(X2+W/2,△X)      (7)
其中,X2为周期单元的边界的横坐标,△X为周期单元的宽度,W为液晶屏的第二宽度。
通过公式(7)可以求得第一分布余量△B。具体地,确定了每个周期单元后,由于液晶屏的每个子像素的位置是确定的,对每个子像素的宽度位置减去第一分布余量后除以周期单元的宽度,即可以确定出该子像素对应的是哪个周期单元,同时能够确定余数;通过判断余数,可以进一步确定该子像素为第一子像素还是第二子像素。具体地,如果该子像素余数大于二分之一周期单元的宽度,则确定为第二子像素;若该子像素余数小于等于二分之一周期单元的宽度,则确定为第一子像素。基于此,能够准确地确定出第一子像素和第二子像素,无需进行物理调整,只需要通过控制左眼数据流输入到第一子像素组,右眼数据流输入到第二子像素组,观看者即可以实现裸眼3D显示效果。
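公式(7)的第一分布余量计算可示意如下(Python 的 % 运算即对应文中的 MOD 取余;函数名为说明用的假设命名):

```python
def first_margin(X2, W, dX):
    """公式(7):ΔB = MOD(X2 + W/2, ΔX)。
    X2 为周期单元边界的横坐标,W 为液晶屏的第二宽度,dX 为周期单元宽度;
    结果为液晶屏一侧边缘上无法被整数个周期单元覆盖的剩余宽度。"""
    return (X2 + W / 2.0) % dX
```

例如,X2=0.1、W=2.0、ΔX=0.3 时,ΔB≈0.2,即该侧边缘剩余约三分之二个周期单元的宽度。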
如图4所示,在一些实施例中,显示模组的参数还包括显示模组中液晶屏的各子像素的中点与液晶屏的边缘的第二距离;根据第一分布余量和周期单元的宽度,确定第一子像素组和第二子像素组包括:
S410:根据第二距离、第一分布余量和周期单元的宽度,确定子像素的第二分布余量;
S420:在子像素的第二分布余量小于或等于二分之一的周期单元的宽度的情况下,确定该子像素为第一子像素组中的第一子像素;
S430:在子像素的第二分布余量大于二分之一的周期单元的宽度的情况下,确定该子像素为第二子像素组中的第二子像素。
对于显示模组而言,液晶屏上的每个像素的物理位置是确定的。在本公开中,实现子像素重排指的是对子像素输入左眼视图还是输入右眼视图的输出内容进行调整,例如,将一个子像素作为显示左眼视图的第一子像素还是显示右眼视图的第二子像素。在本公开实施例中,对液晶屏中的子像素进行重排,能够生成适应观看者人眼位置裸眼3D效果,实质上改变的也是每个子像素输出的图像内容。
在本公开实施例中,需要先确定液晶屏上存在第一分布余量的边缘,然后确定每个子像素相对于该液晶屏边缘的位置,进而确定当前的子像素作为第一子像素输出左眼数据流还是作为第二子像素输出右眼数据流。一个像素中通常包括三种子像素,分别为红色子像素R、绿色子像素G和蓝色子像素B,每个子像素的中点与液晶屏的边缘的第二距离是通过如下公式(8)-(10)确定的。
△RJ = J × N - 5N/6      (8)
△GJ = J × N - N/2      (9)
△BJ = J × N - N/6      (10)
其中,△RJ为红色子像素的中点与液晶屏的边缘的第二距离;△GJ为绿色子像素的中点与液晶屏的边缘的第二距离;△BJ为蓝色子像素的中点与液晶屏的边缘的第二距离;J代表的是第几列的像素;N为像素宽度。通常一个像素内包括有并排设置的两个红色子像素,两个绿色子像素和两个蓝色子像素,因此,根据像素内的子像素的排列,红色子像素需要减去5N/6,绿色子像素需要减去N/2,蓝色子像素需要减去N/6。
在确定了各子像素的中点与液晶屏的边缘的第二距离后,需要求各子像素的中点与液晶屏的边缘的第二距离除以周期单元的宽度所得的余数,由于存在上述实施例中指出的第一分布余量,在此,需要考虑第一分布余量。根据如下公式(11)-(13)可以确定第二分布余量。
Position R=MOD(△RJ+△X-△B,△X)    (11)
Position G=MOD(△GJ+△X-△B,△X)    (12)
Position B=MOD(△BJ+△X-△B,△X)    (13)
其中,Position R为红色子像素的第二分布余量;Position G为绿色子像素的第二分布余量;Position B为蓝色子像素的第二分布余量;△RJ为红色子像素的中点与液晶屏的边缘的第二距离;△GJ为绿色子像素的中点与液晶屏的边缘的第二距离;△BJ为蓝色子像素的中点与液晶屏的边缘的第二距离;△X为周期单元的宽度;△B为第一分布余量。公式(11)-(13)中,在进行求余之前,被除数均增加了一个△X,以避免在计算过程中出现负数。
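公式(8)至(13)的子像素第二分布余量计算可示意如下(假设一个像素内按两红、两绿、两蓝并排排列,像素宽度为 N;函数与变量命名为说明用的假设):

```python
def second_margins(J, N, dX, dB):
    """先按公式(8)-(10)求第 J 列各子像素中点到液晶屏边缘的第二距离,
    再按公式(11)-(13)求其对周期单元宽度 dX 的余数
    (被除数先加一个 dX 再减去第一分布余量 dB,以避免负数)。"""
    dRJ = J * N - 5.0 * N / 6.0   # 公式(8)
    dGJ = J * N - N / 2.0         # 公式(9)
    dBJ = J * N - N / 6.0         # 公式(10)
    pos = lambda d: (d + dX - dB) % dX  # 公式(11)-(13)
    return pos(dRJ), pos(dGJ), pos(dBJ)
```

例如,J=1、N=6、ΔX=4、ΔB=1 时,三种子像素的第二分布余量分别为 0、2、0。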
在确定了各子像素的第二分布余量后,根据如下确定各子像素属于第一子像素还是第二子像素:
在Position R小于或等于△X/2的情况下,确定该红色子像素为第一子像素,用于输入左眼数据流;
在Position R大于△X/2的情况下,确定该红色子像素为第二子像素,用于输入右眼数据流;
在Position B小于或等于△X/2的情况下,确定该蓝色子像素为第一子像素,用于输入左眼数据流;
在Position B大于△X/2的情况下,确定该蓝色子像素为第二子像素,用于输入右眼数据流;
在Position G小于或等于△X/2的情况下,确定该绿色子像素为第一子像素,用于输入左眼数据流;
在Position G大于△X/2的情况下,确定该绿色子像素为第二子像素,用于输入右眼数据流。
根据上述方式能够准确地确定出第一子像素和第二子像素,无需进行显示模组的物理调整,只需要通过控制左眼数据流输入到第一子像素,右眼数据流输入到第二子像素,观看者即可以实现裸眼3D观看该3D内容。
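在求得某个子像素的第二分布余量后,其归属判断可示意如下(仅为按上文判断规则的示意性实现,'L'/'R' 为示例中约定的标记):

```python
def classify(position, dX):
    """第二分布余量小于等于 dX/2 时为第一子像素(输入左眼数据流,记 'L');
    大于 dX/2 时为第二子像素(输入右眼数据流,记 'R')。"""
    return 'L' if position <= dX / 2.0 else 'R'
```

例如,classify(0.2, 1.0) 返回 'L',classify(0.7, 1.0) 返回 'R';对液晶屏上全部子像素逐一执行该判断,即得到第一子像素组和第二子像素组。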
在一些实施例中,该空间现实显示方法还包括:
获取视频信号;
根据视频信号,确定左眼数据流和右眼数据流。
视频信号可以为3D内容的信号,该3D内容可以在PC(Personal Computer,个人计算机)上生成,例如可以在3D软件(如UE/Unity)上生成。在一些实施例中,视频信号例如也可以由笔记本电脑、3D内容生成处理器或者云端服务器上处理生成,甚至可以在一些移动终端上生成。本公开对于3D内容的生成方式不做限定。生成视频信号的设备可以为显示系统上设备;也可以是显示系统外设备,其通过将生成的视频信号发送给显示系统,使得视频信号的分辨率为7680X4320@60Hz,视频信号可以为3D-SideBySide(并列型3D信号)模式或3D-FrameByFrame(时序型3D信号)模式。具体可以通过输出接口HDMI2.0X4输出到显示系统。输出的视频信号中可以包括视频分辨率7680X4320@60Hz的视频图像。相对应地,显示系统上设有与输出接口相对应的电连接的HDMI_RX模块,HDMI_RX模块包括HDMI 2.0X4的接口来接收视频信号。
根据视频信号,能够确定左眼数据流和右眼数据流,其中,左眼数据流用于控制3D显示模组输出适于左眼观看的视频流(例如左眼视觉图像),右眼数据流用于控制3D显示模组输出适于右眼观看的视频流(例如右眼视觉图像),以便形成裸眼的3D视觉影像。
在一些实施例中,如图5和图6所示,根据视频信号,确定左眼数据流和右眼数据流包括:
S510:在视频信号的模式为第一模式的情况下,生成第一视频流并将第一视频流存储于第一空间和生成第二视频流并将第二视频流存储于第二空间,第一视频流和第二视频流为不同步的视频流;
S520:在指定周期内重复从第一空间内读取第一视频流,生成左眼数据流;
S530:在指定周期内重复从第二空间内读取第二视频流,生成右眼数据流。
在图6中,l表示第一视频流中单个帧数的图像,r表示第二视频流单个帧数的图像,L表示左眼数据流中单个帧数的图像,R表示右眼数据流单个帧数的图像。
可以通过人工输入或者对视频信号进行解析来确定视频信号为第一模式还是第二模式。
在第一模式下,第一视频流和第二视频流为不同步的视频流,例如对于时序不同步的3D-FrameByFrame模式的视频信号,由于视频信号中左眼视频流和右眼视频流不同步,导致生成裸眼3D图像较差。
在显示系统的存储空间中开辟出第一空间和第二空间,第一空间为奇数帧地址空间,第二空间为偶数帧地址空间;反之亦然,第一空间为偶数帧地址空间,第二空间为奇数帧地址空间。其中,第一空间的大小和第二空间的大小基于3D显示模组的物理分辨率确定。
在本实施例中,第一视频流存储于第一空间,第二视频流存储于第二空间,即奇数帧地址空间用来存储左眼数据,偶数帧地址空间用来存储右眼数据。具体地,在显示系统中,通过将左眼视频流存入奇数帧地址空间中,将右眼视频流存入偶数帧地址空间中。
在同步读取第一空间和第二空间内的视频流时,确保奇数帧地址空间和偶数帧地址空间存在完整的帧数据。
在指定周期内重复从第一空间内读取第一视频流,生成左眼数据流;在指定周期内重复从第二空间内读取第二视频流,生成右眼数据流。指定周期可以为2帧的周期时间,或者4帧的周期时间,当然指定周期也可以根据实际情况进行设定。通过在指定周期内重复读取第一视频流和第二视频流,生成左眼数据流和右眼数据流,能够使得不同步的第一视频流和第二视频流在指定周期内被均衡。即,在指定周期内,读取出同一帧的第一视频流和第二视频流,来生成同步的左眼数据流和右眼数据流。输出同步的左眼数据流和右眼数据流可以避免生成较差的裸眼3D图像。
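第一模式(3D-FrameByFrame)下奇/偶帧地址空间的写入与同步读取可示意如下(仅为示意性的双缓冲草图,类名与方法名为说明用的假设,并非专利中的具体实现):

```python
class FrameSync:
    """第一空间存储第一视频流(左眼帧),第二空间存储第二视频流(右眼帧);
    读取侧按帧号成对读出,并在指定周期内重复读取同一对帧,
    从而输出同步的左眼数据流和右眼数据流。"""
    def __init__(self):
        self.space1 = {}  # 第一空间(如奇数帧地址空间)
        self.space2 = {}  # 第二空间(如偶数帧地址空间)
    def write(self, n, left_frame=None, right_frame=None):
        if left_frame is not None:
            self.space1[n] = left_frame
        if right_frame is not None:
            self.space2[n] = right_frame
    def read_pair(self, n, repeat=2):
        # 在指定周期(repeat 帧)内重复读取同一帧号的左右帧,保证输出同步
        return [(self.space1[n], self.space2[n]) for _ in range(repeat)]
```

即使写入侧左右帧到达的时刻不同步,只要读取时同一帧号的左右帧均已写入完整,读出的左右眼数据流在每个指定周期内都是同一帧的配对。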
在一些实施例中,根据视频信号,确定左眼数据流和右眼数据流包括:
在确定视频信号的模式为第二模式的情况下,生成第三视频流和第四视频流,第二模式下第三视频流和第四视频流为同步的视频流;
对第三视频流和第四视频流进行指定倍率的拉伸,生成左眼数据流和右眼数据流,其中,指定倍率大于等于2。
由于视频信号中包括同步的第三视频流和第四视频流,通过解析视频信号既可以得到同步的第三视频流和第四视频流。
第二模式下第三视频流和第四视频流为并列同步的视频流。例如,时序相同的3D-SideBySide(并列型3D信号)模式的视频信号,该视频信号的大小受到生成3D内容的设备与显示系统之间的输入接口和输出接口之间的约束,一般为8K分辨率的视频。此时将视频信号分割为左眼视频流和右眼视频流,会导致生成的左眼数据流和右眼数据流均为小于等于4K分辨率的数据流,这样的数据流输出到3D显示模组中,显示的效果会较差,显示的分辨率较低,造成裸眼3D的观看体验较差。
在处理视频信号时,将视频信号分割为左眼视频流和右眼视频流后,由于在显示系统中没有带宽的影响,此时的左眼视频流和右眼视频流之和可以大于8K。因此,可以对第三视频流和第四视频流进行指定倍率的拉伸,左眼数据流和右眼数据流经过水平拉伸后,其分辨率得到有效地提升,从而能够使得在3D显示模组上输出较为清晰的帧图像,进而能够确保裸眼3D图像的显示效果。
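第二模式(3D-SideBySide)下的分割与水平拉伸可示意如下(此处以简单的列复制实现 2 倍拉伸,实际系统可采用插值;函数名为说明用的假设):

```python
def split_and_stretch(frame, factor=2):
    """将并列(SideBySide)帧按宽度一分为二得到左、右半帧,
    再对每个半帧做 factor(>=2)倍的水平拉伸,恢复水平分辨率。"""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    stretch = lambda img: [[p for p in row for _ in range(factor)] for row in img]
    return stretch(left), stretch(right)
```

例如,一行为 [1, 2, 3, 4] 的并列帧被分割并按 2 倍拉伸后,左半帧为 [1, 1, 2, 2],右半帧为 [3, 3, 4, 4],即每个半帧的水平宽度恢复为原并列帧的宽度。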
在一些实施例中,视频信号被识别为第三模式的视频信号,第三模式的视频信号为2D的视频流,此时可以将2D视频流直接输出到3D显示模组上。
在一些实施例中,该空间现实显示方法还可以包括:
获取左眼三维坐标和右眼三维坐标;
基于左眼三维坐标和右眼三维坐标,确定视频信号。
在本实施例中,视频信号的生成可以在显示系统之外完成,即在其他的3D内容生成设备上生成3D内容,然后将生成的3D内容发送至显示系统。显示系统从视觉识别装置中获取左眼三维坐标和右眼三维坐标。以屏幕正中心(3D显示模组上表面的正中心)为原点,垂直于棱镜表面方向为Z轴,以棱镜上表面的宽度方向为X轴,以棱镜的长度方向为Y轴,建立三维坐标系。根据左眼三维坐标和右眼三维坐标调整输出的3D内容,从而生成视频信号。
生成3D内容的设备也可以作为显示系统的一部分。从视觉识别装置中获取左眼三维坐标和右眼三维坐标,并根据左眼三维坐标和右眼三维坐标生成3D内容作为视频信息。
在本实施例中,通过引入左眼三维坐标和右眼三维坐标来生成对应的3D内容,能够根据观看者的视角调整3D内容,进而调整用于显示3D内容的第一子像素和第二子像素,以使得观看者能够体验到更好的裸眼3D的视觉效果。
作为本申请实施例的另一个方面,本申请实施例提供一种空间现实显示系统。如图7所示,本申请实施例的空间现实显示系统700用于执行本申请实施例的空间现实显示方法。
显示系统700可以为裸眼3D的显示模组,例如通过显示模组内的处理器、处理芯片等执行该空间现实显示方法,并在显示模组中显示。显示系统也可以包括另外的控制器或处理器,在显示系统700上执行了空间现实显示方法后,控制裸眼3D的显示模组通过第一子像素组输出左眼数据流和通过第二子像素组输出右眼数据流。
本实施例的空间现实显示系统,通过跟踪人眼位置坐标,实时确定出显示模组中液晶屏的第一子像素和第二子像素,并实时地根据人眼位置坐标在第一子像素内输出左眼数据流、在第二子像素内输出右眼数据流,即实现了根据人眼的坐标位置对显示模组内的子像素进行重新分布,以适配观看者的人眼位置。使得观看者在不同角度或不同位置都能够清晰地观看到裸眼3D,而无需对显示屏进行移动,避免了因为移动显示屏造成难以准确定位到人眼位置而导致的裸眼3D效果较差的问题,以便能够带来较高的裸眼3D视觉效果,提升观看者的观看体验。
在一些实施例中,显示系统700包括:
显示装置710,包括用于进行3D显示的显示模组,其包括多个子像素,以及
信号输出装置720,与显示装置710电连接,用于将基于人眼位置生成的左眼视频流和右眼视频流分别传输至显示装置710中的第一子像素组和第二子像素组。
信号输出装置720例如为PC(Personal Computer,个人计算机)、笔记本电脑、移动终端、3D内容生成处理器或者是云端服务器。例如:在PC的3D软件上生成的3D内容,例如在UE/Unity上生成的3D内容,基于3D内容来生成3D视频信号。
上述的显示装置710可以集成有存储器和处理器。因此,本公开上述的空间现实显示方法可以作为程序指令存储在存储器中,通过处理器来执行存储在存储器中的程序指令实施空间现实显示方法。具体地,处理器基于信号输出装置720输出的左眼视频流和右眼视频流以及存储在存储器中的显示装置710所包括的3D显示模组的物理参数,确定显示模组的显示屏上用于接收左眼视频流的第一子像素组和用于接收右眼视频流的第二子像素组,来实现3D显示。
在一些实施例中,显示系统700还包括:
人眼识别装置730,其电连接至显示装置710和信号输出装置720,用于向显示装置710输出人眼位置坐标和向信号输出装置720输出人眼位置坐标。
人眼识别装置730可以为一体式的视觉识别装置,通过自带的摄像头拍摄到人脸图像,通过例如训练好的神经网络等方式确定出人眼位置坐标中左眼三维坐标、右眼三维坐标,分别发送给显示装置710和信号输出装置720,以使得显示装置710和信号输出装置720能够执行上述的空间现实显示方法。
在一些实施例中,人眼识别装置730还包括:
采集单元731,用于采集人眼信息;
识别单元732,电连接到采集单元731、显示装置710和信号输出装置720,用于对人眼信息进行识别,确定人眼坐标信息,人眼坐标信息包括人眼位置坐标、左眼三维坐标和右眼三维坐标,并将人眼位置坐标输出至显示装置710、将左眼三维坐标和右眼三维坐标输出至信号输出装置720。
采集单元731可以为相机、摄像头等,可以采用摄像头采集到人脸图像;识别单元732可以为眼部识别装置,例如SOC(System on Chip,系统级芯片)、训练好的神经网络等,对人脸图像进行识别,确定出人眼位置坐标。也可以采用一体式的视觉识别装置,直接识别到人眼坐标信息,人眼坐标信息包括人眼位置坐标、左眼三维坐标和右眼三维坐标。通过分别向显示装置710发送人眼位置坐标以及向信号输出装置720发送左眼三维坐标和右眼三维坐标的方式,使得显示装置710能够获取到人眼位置坐标,信号输出装置720能够获取到左眼三维坐标和右眼三维坐标,从而使得显示装置710和信号输出装置720能够执行对应的空间现实显示方法。
上述实施例的空间现实显示系统的其他构成可以采用于本领域普通技术人员现在和未来知悉的各种技术方案,这里不再详细描述。
在本说明书的描述中,需要理解的是,术语“中心”、“纵向”、“横向”、“长度”、“宽度”、“厚度”、“上”、“下”、“前”、“后”、“左”、“右”、“竖直”、“水平”、“顶”、“底”、“内”、“外”、“顺时针”、“逆时针”、“轴向”、“径向”、“周向”等指示的方位或位置关系为基于附图所示的方位或位置关系,仅是为了便于描述本申请和简化描述,而不是指示或暗示所指的装置或元件必须具有特定的方位、以特定的方位构造和操作,因此不能理解为对本申请的限制。
此外,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者多个该特征。在本申请的描述中,“多个”的含义是两个或两个以上,除非另有明确具体的限定。
在本申请中,除非另有明确的规定和限定,术语“安装”、“相连”、“连接”、“固定”等术语应做广义理解,例如,可以是固定连接,也可以是可拆卸连接,或成一体;可以是机械连接,也可以是电连接,还可以是通信;可以是直接相连,也可以通过中间媒介间接相连,可以是两个元件内部的连通或两个元件的相互作用关系。对于本领域的普通技术人员而言,可以根据具体情况理解上述术语在本申请中的具体含义。
在本申请中,除非另有明确的规定和限定,第一特征在第二特征之“上”或之“下”可以包括第一和第二特征直接接触,也可以包括第一和第二特征不是直接接触而是通过它们之间的另外的特征接触。而且,第一特征在第二特征“之上”、“上方”和“上面”包括第一特征在第二特征正上方和斜上方,或仅仅表示第一特征水平高度高于第二特征。第一特征在第二特征“之下”、“下方”和“下面”包括第一特征在第二特征正下方和斜下方,或仅仅表示第一特征水平高度小于第二特征。
上文的公开提供了许多不同的实施方式或例子用来实现本申请的不同结构。为了简化本申请的公开,上文中对特定例子的部件和设置进行描述。当然,它们仅仅为示例,并且目的不在于限制本申请。此外,本申请可以在不同例子中重复参考数字和/或参考字母,这种重复是为了简化和清楚的目的,其本身不指示所讨论各种实施方式和/或设置之间的关系。流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现特定逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分。并且本申请的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能。
在流程图中表示或在此以其他方式描述的逻辑和/或步骤,例如,可以被认为是用于实现逻辑功能的可执行指令的定序列表,可以具体实现在任何计算机可读介质中,以供指令执行系统、装置或设备(如基于计算机的系统、包括处理器的系统或其他可以从指令执行系统、装置或设备取指令并执行指令的系统)使用,或结合这些指令执行系统、装置或设备而使用。
应理解的是,本申请的各部分可以用硬件、软件、固件或它们的组合来实现。在上述实施方式中,多个步骤或方法可以用存储在存储器中且由合适的指令执行系统执行的软件或固件来实现。上述实施例方法的全部或部分步骤是可以通过程序来指令相关的硬件完成,该程序可以存储于一种计算机可读存储介质中,该程序在执行时,包括方法实施例的步骤之一或其组合。
此外,在本申请各个实施例中的各功能单元可以集成在一个处理模块中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。上述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读存储介质中。该存储介质可以是只读存储器,磁盘或光盘等。
以上,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到其各种变化或替换,这些都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (20)

  1. 一种空间现实显示方法,包括:
    获取人眼位置、包括多个子像素的显示模组的参数和用于实现3D显示的左眼数据流、右眼数据流;
    根据所述显示模组的参数,建立坐标系;
    根据所述人眼位置和所述显示模组的参数在所述坐标系中的坐标,确定用于实现3D显示的左眼视图和右眼视图对应的周期单元的属性参数,所述周期单元的属性参数包括周期单元的宽度和周期单元的边界的横坐标,其中,所述周期单元的宽度为左眼视图和右眼视图中相邻左眼视图和右眼视图的宽度之和;
    根据所述显示模组的参数、所述周期单元的宽度和所述周期单元的边界的横坐标,将所述多个子像素划分成第一子像素组和第二子像素组,其中,所述第一子像素组中的各第一子像素为显示模组中输出所述左眼数据流的子像素,所述第二子像素组中的各第二子像素为在所述显示模组中输出所述右眼数据流的子像素。
  2. 根据权利要求1所述的空间现实显示方法,其中,所述显示模组的参数包括显示模组上表面的中心、显示模组上表面的垂直方向以及显示模组的宽度方向;以及
    所述根据所述显示模组的参数,建立坐标系包括:
    以所述显示模组上表面的中心为原点、以所述显示模组上表面的垂直方向为第一坐标轴方向以及以所述显示模组的宽度方向为第二坐标轴方向,建立坐标系。
  3. 根据权利要求1所述的空间现实显示方法,其中,所述显示模组的参数包括显示模组所包括的多个棱镜中的一个棱镜的第一宽度和所述显示模组中棱镜表面至液晶屏的第一距离;以及
    所述根据所述人眼位置和所述显示模组的参数在所述坐标系的坐标,确定周期单元的宽度和周期单元的边界的横坐标包括:
    根据所述第一宽度、所述第一距离、所述人眼位置在所述坐标系的坐标,确定左眼视图和右眼视图的周期单元的宽度和周期单元的边界的横坐标。
  4. 根据权利要求1所述的空间现实显示方法,其中,所述显示模组的参数包括显示模组中液晶屏的第二宽度,以及
    所述根据所述显示模组的参数、所述周期单元的宽度和所述周期单元的边界的横坐标,确定第一子像素组和第二子像素组包括:
    获取所述显示模组中液晶屏的第二宽度;
    根据所述周期单元的宽度、所述周期单元的边界的横坐标以及所述第二宽度,确定所述液晶屏的第一分布余量,其中,所述第一分布余量为所述液晶屏上与整数倍个周期单元对应后剩余的部分;
    根据所述第一分布余量和所述周期单元的宽度,确定第一子像素组和第二子像素组。
  5. 根据权利要求4所述的空间现实显示方法,其中,所述第一分布余量处于所述液晶屏在所述第二坐标轴方向上的一个边缘。
  6. 根据权利要求5所述的空间现实显示方法,其中,所述显示模组的参数还包括显示模组中液晶屏的各子像素的中点与液晶屏的所述边缘的第二距离;以及
    所述根据所述第一分布余量和所述周期单元的宽度,确定第一子像素组和第二子像素组包括:
    根据所述第二距离、所述第一分布余量和所述周期单元的宽度,确定子像素的第二分布余量,所述第二分布余量为该子像素距离所述边缘的第二距离减去所述第一分布余量后的距离除以所述周期单元的宽度后得到的余数;
    在子像素的第二分布余量小于或等于二分之一的所述周期单元的宽度的情况下,确定该子像素为第一子像素组中的第一子像素;
    在子像素的第二分布余量大于二分之一的所述周期单元的宽度的情况下,确定该子像素为第二子像素组中的第二子像素。
  7. 根据权利要求1所述的空间现实显示方法,其中,所述方法在获取左眼数据流和右眼数据流之前还包括:
    获取视频信号;以及
    根据所述视频信号,确定所述左眼数据流和所述右眼数据流。
  8. 根据权利要求7所述的空间现实显示方法,其中,所述根据所述视频信号,确定左眼数据流和右眼数据流包括:
    在所述视频信号的模式为第一模式的情况下,生成第一视频流并将所述第一视频流存储于第一空间和生成第二视频流并将所述第二视频流存储于第二空间,所述第一模式的视频信号为所述第一视频流和所述第二视频流为不同步的视频流;
    在指定周期内重复从所述第一空间内读取所述第一视频流,生成左眼数据流;
    在指定周期内重复从所述第二空间内读取所述第二视频流,生成右眼数据流。
  9. 根据权利要求7所述的空间现实显示方法,其中,所述根据所述视频信号,确定左眼数据流和右眼数据流包括:
    在所述视频信号的模式为第二模式的情况下,生成第三视频流和第四视频流,所述第三视频流和所述第四视频流为并列同步的视频流;以及
    对所述第三视频流和所述第四视频流进行指定倍率的拉伸,生成左眼数据流和右眼数据流,其中,所述指定倍率大于等于2。
  10. 一种空间现实显示系统,包括显示模组、处理器和存储器,其中,所述显示模组包括多个子像素,所述处理器执行存储在所述存储器中的程序以执行包括以下步骤的方法:
    接收人眼位置、显示模组的物理参数、用于实现所述显示模组的3D显示的左眼数据流和右眼数据流;
    根据所述显示模组的物理参数,建立坐标系;
    根据所述人眼位置和所述显示模组的参数在所述坐标系中的坐标,确定用于实现所述显示模组的三维显示的左眼视图和右眼视图的周期单元的属性参数,所述周期单元的属性参数包括周期单元的宽度和周期单元的边界的横坐标,其中,所述周期单元的宽度为相邻的左眼视图和右眼视图的宽度之和;
    根据所述显示模组的参数、所述周期单元的宽度和所述周期单元的边界的横坐标,将所述多个子像素划分成第一子像素组和第二子像素组,其中,所述第一子像素组中的各第一子像素为显示模组中输出所述左眼数据流的子像素,所述第二子像素组中的各第二子像素为在所述显示模组中输出所述右眼数据流的子像素。
  11. 根据权利要求10所述的空间现实显示系统,还包括:
    信号输出装置,其用于接收视频信号并且连接至所述处理器;
    人眼识别装置,其电连接至所述处理器,并且进一步包括:
    采集单元,用于采集人眼位置信息;以及
    识别单元,其电连接至所述采集单元和所述处理器,用于对所述人眼位置信息进行识别,确定人眼位置,并将所述人眼位置输出至所述处理器。
  12. 根据权利要求10所述的空间现实显示系统,其中,所述显示模组的参数包括显示模组上表面的中心、显示模组上表面的垂直方向以及显示模组的宽度方向;以及
    所述根据所述显示模组的参数,建立坐标系包括:
    以所述显示模组上表面的中心为原点、以所述显示模组上表面的垂直方向为第一坐标轴方向以及以所述显示模组的宽度方向为第二坐标轴方向,建立坐标系。
  13. 根据权利要求10所述的空间现实显示系统,其中,所述显示模组的参数包括显示模组所包括的多个棱镜中的一个棱镜的第一宽度和所述显示模组中棱镜表面至液晶屏的第一距离;以及
    所述根据所述人眼位置和所述显示模组的参数在所述坐标系的坐标,确定周期单元的宽度和周期单元的边界的横坐标包括:
    根据所述第一宽度、所述第一距离、所述人眼位置在所述坐标系的坐标,确定左眼视图和右眼视图的周期单元的宽度和周期单元的边界的横坐标。
  14. 根据权利要求10所述的空间现实显示系统,其中,所述显示模组的参数包括显示模组中液晶屏的第二宽度,以及
    所述根据所述显示模组的参数、所述周期单元的宽度和所述周期单元的边界的横坐标,确定第一子像素组和第二子像素组包括:
    获取所述显示模组中液晶屏的第二宽度;
    根据所述周期单元的宽度、所述周期单元的边界的横坐标以及所述第二宽度,确定所述液晶屏的第一分布余量,其中,所述第一分布余量为所述液晶屏上与整数倍个周期单元对应后剩余的部分;
    根据所述第一分布余量和所述周期单元的宽度,确定第一子像素组和第二子像素组。
  15. 根据权利要求14所述的空间现实显示系统,其中,所述第一分布余量处于所述液晶屏在所述第二坐标轴方向上的一个边缘。
  16. 根据权利要求15所述的空间现实显示系统,其中,所述显示模组的参数还包括显示模组中液晶屏的各子像素的中点与液晶屏的所述边缘的第二距离;以及
    所述根据所述第一分布余量和所述周期单元的宽度,确定第一子像素组和第二子像素组包括:
    根据所述第二距离、所述第一分布余量和所述周期单元的宽度,确定子像素的第二分布余量,所述第二分布余量为该子像素距离所述边缘的第二距离减去所述第一分布余量后的距离除以所述周期单元的宽度后得到的余数;
    在子像素的第二分布余量小于或等于二分之一的所述周期单元的宽度的情况下,确定该子像素为第一子像素组中的第一子像素;
    在子像素的第二分布余量大于二分之一的所述周期单元的宽度的情况下,确定该子像素为第二子像素组中的第二子像素。
  17. 一种非易失性计算机可读存储介质,其上存储有指令,当这些指令被处理器执行时,使得处理器执行包括以下步骤的方法:
    获取人眼位置、包括多个子像素的显示模组的参数和用于实现3D显示的左眼数据流、右眼数据流;
    根据所述显示模组的参数,建立坐标系;
    根据所述人眼位置和所述显示模组的参数在所述坐标系中的坐标,确定用于实现3D显示的左眼视图和右眼视图对应的周期单元的属性参数,所述周期单元的属性参数包括周期单元的宽度和周期单元的边界的横坐标,其中,所述周期单元的宽度为左眼视图和右眼视图中相邻左眼视图和右眼视图的宽度之和;
    根据所述显示模组的参数、所述周期单元的宽度和所述周期单元的边界的横坐标,将所述多个子像素划分成第一子像素组和第二子像素组,其中,所述第一子像素组中的各第一子像素为显示模组中输出所述左眼数据流的子像素,所述第二子像素组中的各第二子像素为在所述显示模组中输出所述右眼数据流的子像素。
  18. 根据权利要求17所述的非易失性计算机可读存储介质,其中,所述显示模组的参数包括显示模组上表面的中心、显示模组上表面的垂直方向以及显示模组的宽度方向;以及
    所述根据所述显示模组的参数,建立坐标系包括:
    以所述显示模组上表面的中心为原点、以所述显示模组上表面的垂直方向为第一坐标轴方向以及以所述显示模组的宽度方向为第二坐标轴方向,建立坐标系。
  19. 根据权利要求17所述的非易失性计算机可读存储介质,其中,所述显示模组的参数包括显示模组所包括的多个棱镜中的一个棱镜的第一宽度和所述显示模组中棱镜表面至液晶屏的第一距离;以及
    所述根据所述人眼位置和所述显示模组的参数在所述坐标系的坐标,确定周期单元的宽度和周期单元的边界的横坐标包括:
    根据所述第一宽度、所述第一距离、所述人眼位置在所述坐标系的坐标,确定左眼视图和右眼视图的周期单元的宽度和周期单元的边界的横坐标。
  20. 根据权利要求17所述的非易失性计算机可读存储介质,其中,所述显示模组的参数包括显示模组中液晶屏的第二宽度,以及
    所述根据所述显示模组的参数、所述周期单元的宽度和所述周期单元的边界的横坐标,确定第一子像素组和第二子像素组包括:
    获取所述显示模组中液晶屏的第二宽度;
    根据所述周期单元的宽度、所述周期单元的边界的横坐标以及所述第二宽度,确定所述液晶屏的第一分布余量,其中,所述第一分布余量为所述液晶屏上与整数倍个周期单元对应后剩余的部分;
    根据所述第一分布余量和所述周期单元的宽度,确定第一子像素组和第二子像素组。
PCT/CN2023/127651 2022-10-31 2023-10-30 空间现实显示方法、空间现实显示系统以及非易失性计算机可读存储介质 WO2024093893A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211351069.X 2022-10-31
CN202211351069.XA CN117998073A (zh) 2022-10-31 2022-10-31 空间现实显示方法及空间现实显示系统

Publications (1)

Publication Number Publication Date
WO2024093893A1 true WO2024093893A1 (zh) 2024-05-10

Family

ID=90889456

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/127651 WO2024093893A1 (zh) 2022-10-31 2023-10-30 空间现实显示方法、空间现实显示系统以及非易失性计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN117998073A (zh)
WO (1) WO2024093893A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104202590A (zh) * 2014-06-19 2014-12-10 杭州立体世界科技有限公司 高清裸眼便携式立体影视播放器控制电路及转换方法
CN107172417A (zh) * 2017-06-30 2017-09-15 深圳超多维科技有限公司 一种裸眼3d屏幕的图像显示方法、装置及系统
CN108174182A (zh) * 2017-12-30 2018-06-15 上海易维视科技股份有限公司 三维跟踪式裸眼立体显示视区调整方法及显示系统
WO2022163728A1 (ja) * 2021-01-26 2022-08-04 京セラ株式会社 3次元表示装置

Also Published As

Publication number Publication date
CN117998073A (zh) 2024-05-07

Similar Documents

Publication Publication Date Title
CN102428707B (zh) 立体视用图像对位装置和立体视用图像对位方法
AU2008204084B2 (en) Method and apparatus for generating stereoscopic image from two-dimensional image by using mesh map
RU2519518C2 (ru) Устройство генерирования стереоскопического изображения, способ генерирования стереоскопического изображения и программа
JP4827783B2 (ja) 画像表示装置
JP2009509398A (ja) 立体視フォーマット・コンバータ
WO2019184251A1 (en) Rendering method, computer product and display apparatus
US8723920B1 (en) Encoding process for multidimensional display
WO2016045425A1 (zh) 一种两视点立体图像合成方法及系统
CN102379126A (zh) 图像显示装置和方法以及程序
US20130293669A1 (en) System and method for eye alignment in video
TWI432013B (zh) 立體影像顯示方法及影像時序控制器
US20200029057A1 (en) Systems and methods for correcting color separation in field-sequential displays
US8872902B2 (en) Stereoscopic video processing device and method for modifying a parallax value, and program
TW201445977A (zh) 影像處理方法及影像處理系統
JP2011164781A (ja) 立体画像生成プログラム、情報記憶媒体、立体画像生成装置、及び立体画像生成方法
US20140071237A1 (en) Image processing device and method thereof, and program
JP2013065951A (ja) 表示装置、表示方法、及びプログラム
US20120120190A1 (en) Display device for use in a frame sequential 3d display system and related 3d display system
JP4634863B2 (ja) 立体視画像生成装置及び立体視画像生成プログラム
US7034819B2 (en) Apparatus and method for generating an interleaved stereo image
CN109922326B (zh) 裸眼3d视频图像的分辨率确定方法、装置、介质及设备
TWI462569B (zh) 三維影像攝相機及其相關控制方法
WO2024093893A1 (zh) 空间现实显示方法、空间现实显示系统以及非易失性计算机可读存储介质
JP2004102526A (ja) 立体画像表示装置、表示処理方法及び処理プログラム
Kang Wei et al. Three-dimensional scene navigation through anaglyphic panorama visualization