WO2020019548A1 - Naked-eye 3D display method, apparatus, device and medium based on human eye tracking - Google Patents

Naked-eye 3D display method, apparatus, device and medium based on human eye tracking

Info

Publication number
WO2020019548A1
WO2020019548A1 · PCT/CN2018/111648 · CN2018111648W
Authority
WO
WIPO (PCT)
Prior art keywords
viewer
camera
eyes
angle
distance
Prior art date
Application number
PCT/CN2018/111648
Other languages
English (en)
French (fr)
Inventor
夏正国
Original Assignee
上海玮舟微电子科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海玮舟微电子科技有限公司
Publication of WO2020019548A1 publication Critical patent/WO2020019548A1/zh


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/246 Calibration of cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

Definitions

  • Embodiments of the present disclosure relate to the field of three-dimensional (3D) display technology, for example, a naked-eye 3D display method, device, device, and medium based on human eye tracking.
  • the naked-eye 3D display technology refers to a 3D display technology in which a viewer can directly watch a three-dimensional image with the naked eye without wearing special 3D glasses.
  • naked-eye 3D display technologies include the lenticular grating, the slit grating, and the liquid crystal lens; the most widely used is lenticular 3D display technology.
  • the principle of lenticular 3D display technology is realized by attaching a layer of special lenticular lens in front of the conventional display screen. The pixels of the image under each lenticular lens are divided into sub-pixels so that the lens can project each sub-pixel in a different direction.
  • the left and right eyes respectively see the light emitted by different sub-pixels, so that the left and right eyes of the person see different pictures, and merge into a 3D effect picture in the brain.
  • each lenticular lens projects the pixel content to the left and right eyes separately
  • a fixed light distribution without human eye tracking requires users to find a suitable viewing position in order to see the ideal stereo effect.
  • the light entering the left eye may enter the right eye.
  • the right eye can see both the left image and the right image, which is prone to crosstalk, and the user experience is poor. Therefore, it is necessary to track the position of the human eye in real time, and adjust the display content according to the collected position of the human eye.
  • human eye tracking includes front-back tracking and up-down-left-right tracking.
  • a camera set on the screen is used to capture an image containing both of the viewer's eyes, and the layout cycle width is then calculated based on a calibrated relationship between the imaging distance of the two eyes in the captured image and the actual layout cycle width.
  • the calibration of the above relationship is based on the premise of a fixed interpupillary distance of the viewer, which is generally 6.5 cm.
  • in practice, each person's interpupillary distance is not exactly 6.5 cm, and there are individual differences, so the layout period width calculated in the related art from the imaging distance of both eyes in the image is not accurate.
  • Embodiments of the present disclosure provide a naked eye 3D display method, device, device, and medium based on human eye tracking, so as to track the user's forward and backward movement, and precisely adjust the layout cycle of displayed content.
  • a naked eye 3D display method based on human eye tracking is provided in an embodiment of the present disclosure.
  • the method includes: determining a current distance between the viewer's eyes and the display screen according to a geometric relationship between the viewer's eyes and first and second cameras provided on the display screen; determining a width of a target layout period corresponding to the current distance based on a preset mapping relationship between the distance of the viewer's eyes from the display screen and the layout period width; and adjusting and displaying the layout content according to the width of the target layout period.
  • an embodiment of the present disclosure further provides a naked-eye 3D display device based on human eye tracking.
  • the device includes:
  • the current distance determining module is configured to determine the current distance between the viewer's eyes and the display according to the geometric relationship between the first and second cameras and the viewer's eyes provided on the display;
  • the layout period width determining module is configured to determine a width of a target layout period corresponding to the current distance based on a mapping relationship between a preset distance between the viewer's eyes and the display screen and the layout period width;
  • the layout content adjustment module is configured to adjust and display the layout content according to the width of the target layout period.
  • an embodiment of the present disclosure further provides a device, where the device includes:
  • one or more processors;
  • a storage device configured to store one or more programs;
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the naked-eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure.
  • the embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored.
  • when the program is executed by a processor, the naked-eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure is implemented.
  • the embodiment of the present disclosure adjusts the width of the target layout period according to the actual distance between the viewer's eyes and the display screen. Moreover, since the technical solution only concerns the actual distance of the human eyes from the display screen, there is no need to calculate the three-dimensional coordinates of the human eyes in space using the spatial coordinate system transformation of the related art; the actual distance can be calculated directly from the geometric relationship between the cameras and the human eyes, which greatly simplifies the calculation of the actual distance.
  • compared with the related-art method of calculating the layout cycle width from the imaging distance of the human eyes in an image, the method provided by the embodiments of the present disclosure determines the layout cycle width directly from the actual distance between the human eyes and the screen.
  • the technical solution of the embodiment therefore does not need to use the imaging distance of the two eyes in the image to approximate the actual distance of the human eyes from the display screen, and can be applied to users with any interpupillary distance, without the limitation of individual differences.
  • the calculation accuracy of the layout cycle width has been improved. By using the calculated layout cycle width to adjust layout content in real time, users can experience better naked eye 3D effects.
  • FIG. 1a is a schematic diagram of a layout cycle in the present disclosure
  • FIG. 1b is a schematic diagram showing the relationship between the binocular forward and backward movement and the layout cycle width in the present disclosure
  • FIG. 2 is a flowchart of a naked eye 3D display method based on human eye tracking provided by Embodiment 1 of the present disclosure;
  • FIG. 3a is a schematic diagram of a geometric relationship between a first camera, a second camera, and a viewer's eyes according to Embodiment 1 of the present disclosure
  • FIG. 3b is a schematic diagram of a layout cycle adjustment method provided by Embodiment 1 of the present disclosure.
  • FIG. 4 is a flowchart of a naked eye 3D display method based on human eye tracking provided in Embodiment 2 of the present disclosure;
  • FIG. 5a is a schematic diagram of a positional relationship between a camera and a viewer's eyes provided in Embodiment 2 of the present disclosure
  • FIG. 5b is a geometric relationship diagram of a display screen viewed by a human eye provided in Embodiment 2 of the present disclosure;
  • FIG. 5c is a schematic diagram of calibrating parameters of an inverse proportional function according to Embodiment 2 of the present disclosure;
  • FIG. 5d is another schematic diagram for calibrating parameters of an inverse proportional function provided in Embodiment 2 of the present disclosure.
  • FIG. 6 is a structural block diagram of a naked-eye 3D display device based on human eye tracking provided in Embodiment 3 of the present disclosure
  • FIG. 7 is a schematic structural diagram of a device provided in Embodiment 4 of the present disclosure.
  • the layout parameters include the pitch of the layout period, and the width of each layout period can be quantified as 0 to 1.
  • the width of the grating film can be used as a reference. In one embodiment, it can be expressed by the number of rows of pixels covered by the grating film.
  • FIG. 1a is a schematic diagram of a layout cycle in the present disclosure. As shown in FIG. 1a, each layout cycle includes at least two images, and any two of the at least two images enter the viewer's left and right eyes respectively to form a naked-eye 3D effect.
  • 0 to 1 represents one period, analogous to a sine period of 2π, and at least two images can be arranged within this period.
  • for example, the period from 0 to 1 can include 28 images, which divide the phase of the period into equal proportions: 0 indicates a phase of 0, and 1 indicates a phase of 2π. The unit between 0 and 1 can thus be understood as the phase in radians.
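  • As an illustrative sketch (not part of the original disclosure), the mapping from a layout phase in [0, 1) to one of the 28 view images described above can be written as follows; the function name and the even division of the period are assumptions:

```python
# Illustrative sketch: mapping a layout phase in [0, 1) to one of N view
# images arranged evenly over the period (N = 28 in the example above).
def phase_to_view(phase: float, num_views: int = 28) -> int:
    """Return the index of the view image whose slot contains `phase`."""
    phase = phase % 1.0               # wrap into one layout period
    return int(phase * num_views) % num_views

print(phase_to_view(0.0))   # 0  (phase 0 maps to the first view)
print(phase_to_view(0.5))   # 14 (phase pi maps to the middle view of 28)
```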
  • FIG. 1b is a schematic diagram of the relationship between the binocular forward and backward movement and the layout cycle width in the present disclosure.
  • as shown in FIG. 1b, when the viewer's eyes are at one viewing distance, the width of the layout period should be the length of the line segment EF; when the eyes move forward or backward to another viewing distance, the width of the layout cycle should be the length of the line segment DF.
  • the width of the layout period is related to the viewing distance.
  • therefore, the embodiment of the present disclosure establishes an inverse proportional relationship between the distance of the viewer's eyes from the display screen and the width of the layout period. From this relationship the width of the layout period can be obtained, and the display content is then adjusted according to the obtained layout period width to improve the naked-eye 3D display effect.
  • FIG. 2 is a flowchart of a naked eye 3D display method based on human eye tracking provided in Embodiment 1 of the present disclosure.
  • the method may be performed by a naked eye 3D display device based on human eye tracking, and the device may be implemented in software and/or hardware and integrated in a control device that plays and displays content.
  • the method in this embodiment includes:
  • Step 110 Determine the current distance between the viewer's eyes and the display screen according to the geometric relationship between the first and second cameras and the viewer's eyes set on the display screen.
  • the first camera and the second camera may be camera modules provided on the display screen, or external cameras that are fixedly mounted on the display screen and have a communication connection with the control device and face the viewing area of the display screen.
  • the display screen in this embodiment may be a display screen of a small display device such as a mobile phone, a tablet computer, or a notebook computer, or may be a display screen of a large display device such as a large-screen color TV or an advertising machine.
  • a grating film is arranged on the display screen in this embodiment.
  • the grating film may be a lenticular grating film or a slit grating film.
  • the arrangement direction of the grating film may be perpendicular to the bottom edge of the display screen, or it may be laid at a certain inclined angle with the bottom edge of the display screen.
  • the cameras in the embodiments of the present disclosure are all perpendicular to the display screen, so the optical axes of the two cameras are parallel. If there is an angle between the optical axes of the two cameras, that angle must be calibrated first to ensure that the cameras remain perpendicular to the display.
  • the position of the eyes of the viewer in this embodiment may be the center position of the eyes of the viewer, and may also be the left or right eye of the viewer.
  • the geometric relationship between the first camera, the second camera, and the viewer's eyes may be a triangle established in space.
  • the current distance of the viewer's eyes from the display is the height of the triangle.
  • there are multiple ways to calculate the height of a triangle. For example, the height can be calculated from the lengths of two sides and the angle between them.
  • FIG. 3a is a schematic diagram of a geometric relationship between a first camera, a second camera, and a viewer's eyes according to Embodiment 1 of the present disclosure. As shown in Figure 3a, F is the center position of the viewer's eyes. The distance between the first camera 10 and the second camera 20 is a known constant.
  • given the sizes of the two base angles ∠A and ∠B and the known distance between the cameras, the height AC of the triangle can be calculated according to the cosine theorem, that is, the current distance of the viewer's eyes from the display screen.
  • Step 120 Determine the width of the target layout period corresponding to the current distance based on a mapping relationship between a preset distance between the viewer's eyes and the display screen and the width of the layout period.
  • the mapping relationship is an inverse relationship, that is, the larger the current distance of the viewer's eyes from the display screen, the smaller the width of the target layout period; the smaller the current distance, the wider the target layout period.
  • the width of the corresponding target layout period can be directly determined according to the current distance, so that the layout content can be adjusted according to the width of the target layout period.
  • Step 130 Adjust and display the layout content according to the width of the target layout period.
  • the position of the layout reference line can be used as the starting position, that is, the viewpoint positions on the layout reference line do not change, while the other viewpoint positions shift accordingly.
  • the number of sub-pixels corresponding to each viewpoint can be increased, thereby increasing the number of sub-pixels covered in the horizontal direction of the screen in each layout cycle, that is, increasing the width of the layout cycle.
  • any camera in this embodiment may be used as a reference camera.
  • the pixel coordinates of the reference camera in the coordinate system of the display screen can be used as the layout base point of the software layout, and the line segment through this base point is used as the layout reference line.
  • FIG. 3b is a schematic diagram of a layout cycle adjustment method provided in Embodiment 1 of the present disclosure.
  • FIG. 3b shows an adjustment method for increasing the width of the layout period.
  • the starting positions of the viewpoints corresponding to the layout content of all points on the layout reference line AB do not change, while the starting positions of the viewpoints of other points are correspondingly shifted.
  • the image displayed on the screen is adjusted corresponding to the movement of the viewer, avoiding the occurrence of crosstalk, and improving the viewing experience of the user.
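  • As a hedged sketch of the adjustment above (an assumption for illustration, not the patent's implementation): if the layout phase of each screen column is measured from the layout reference line, columns on the reference line keep their viewpoint phase no matter how the period width changes, while all other columns are correspondingly shifted:

```python
# Illustrative sketch: the layout phase of each screen column, measured from
# the reference line at x_ref. Changing the period width `pitch` leaves
# columns on the reference line unchanged and shifts every other column.
def layout_phase(x: float, x_ref: float, pitch: float) -> float:
    return ((x - x_ref) / pitch) % 1.0

# A column on the reference line keeps its phase when the period is widened:
print(layout_phase(0.0, 0.0, 4.0) == layout_phase(0.0, 0.0, 4.2))    # True
# A column away from the reference line is correspondingly shifted:
print(layout_phase(10.0, 0.0, 4.0) == layout_phase(10.0, 0.0, 4.2))  # False
```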
  • the width of the target layout period can be adjusted according to the actual distance. And the actual distance is obtained by using the relative positions of the two cameras on the display screen and the viewer's eyes.
  • the technical solution of this embodiment does not need to use the imaging distance of the eyes in the image to simulate the actual distance between the eyes and the display screen. Therefore, this solution is suitable for users with any interpupillary distance, and there is no limitation of individual differences.
  • the technical solution of this embodiment focuses on the current distance of the viewer's eyes from the display screen, so as to facilitate tracking of the human eyes in the subsequent front-rear direction. Therefore, the transformation of the spatial coordinate system is not required to determine the three-dimensional coordinates of the human eye in space, which greatly simplifies the calculation method of the layout cycle width, and the calculated layout cycle width is more accurate. By using the calculated layout cycle width to adjust layout content in real time, viewers can experience better naked eye 3D effects.
  • FIG. 4 is a flowchart of a naked-eye 3D display method based on human eye tracking provided in Embodiment 2 of the present disclosure. This embodiment is optimized on the basis of the foregoing embodiment, in which the same or corresponding terms as those in the foregoing embodiment are used. The explanation is not repeated here.
  • the method provided in this embodiment includes:
  • Step 210: Determine a first horizontal distance between the viewer's eyes and the optical axis of the first camera, and a second horizontal distance between the viewer's eyes and the optical axis of the second camera, according to the imaging coordinates of the viewer's eyes in the first camera and the second camera, respectively.
  • because the positions of the first camera and the second camera are different, the positions of the viewer's eyes in the images captured by the two cameras are also different.
  • the optical axis of a camera passes through the midpoint of the image it captures. Therefore, from the imaging coordinates of the viewer's eyes in the first and second cameras, the first horizontal distance of the viewer's eyes from the optical axis of the first camera and the second horizontal distance from the optical axis of the second camera can be determined.
  • FIG. 5a is a schematic diagram of a positional relationship between a camera and a viewer's eyes provided in Embodiment 2 of the present disclosure.
  • the camera shown in FIG. 5a may be a first camera or a second camera.
  • points C and D are the left and right boundary points of the image;
  • F is the imaging coordinates of the viewer's eyes in the image.
  • E is the midpoint of the image captured by the camera.
  • Step 220 Calculate a third angle at which the viewer's eyes deviate from the optical axis of the first camera according to the first horizontal distance, the field angle of the first camera, and the resolution of the captured image.
  • the field angle of the camera is ∠CAD; for example, the field angle may be 60°.
  • the resolution of the image captured by the camera can be set in advance according to actual needs, for example, it can be set to 640 * 480.
  • once the resolution of the image captured by the camera is determined, the length of CE can be determined as half the number of pixels in the image's length direction. For example, if the number of pixels in the length direction is 640, the length of CE is 320 pixels.
  • the size of ∠CAE can then be determined, that is, one half of the field of view. For example, if the field angle is 60°, ∠CAE is 30°.
  • the length of AE can be calculated.
  • the size of ∠EAF can be obtained as arctan(EF/AE), which is the third angle.
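  • The computation of step 220 can be sketched as follows, using the example values in the text (a 60° field of view and a 640-pixel-wide image); the function name is an assumption for illustration:

```python
import math

# Sketch of step 220: `ef_pixels` is the horizontal pixel distance between
# the eye's image point F and the image centre E.
def angle_from_optical_axis(ef_pixels: float,
                            fov_deg: float = 60.0,
                            width_pixels: int = 640) -> float:
    """Angle (degrees) by which the eyes deviate from the camera's optical axis."""
    ce = width_pixels / 2                           # half the image width (CE)
    ae = ce / math.tan(math.radians(fov_deg / 2))   # length AE, in pixel units
    return math.degrees(math.atan(ef_pixels / ae))  # third angle, arctan(EF/AE)

# An eye imaged at the edge of the frame deviates by half the field of view:
print(round(angle_from_optical_axis(320.0), 1))  # 30.0
```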
  • Step 230 Calculate a fourth angle at which the viewer's eyes deviate from the optical axis of the second camera according to the second horizontal distance, the field angle of the second camera, and the resolution of the captured image.
  • the calculation method of the fourth angle is the same as the calculation method of the third angle in the above step 220, and reference may be made to the calculation principle of the third angle, and details are not described herein again.
  • there is no fixed execution order between steps 220 and 230; they can be performed one after the other or simultaneously, which is not limited in this embodiment.
  • Step 240 Calculate a first angle at which the viewer's eyes deviate from the horizontal direction where the first camera is located based on the third angle, and calculate a second angle at which the viewer's eyes deviate from the horizontal direction where the second camera is located according to the fourth angle.
  • the first angle at which the viewer's eyes deviate from the horizontal direction where the first camera is located is (90° - ∠EAF); this is ∠A in FIG. 3a.
  • similarly, the second angle is (90° - the fourth angle); this is ∠B in FIG. 3a.
  • Step 250 Calculate the current distance between the viewer's eyes and the display screen according to the first angle, the second angle, and the distance between the first camera and the second camera.
  • the distance FC can be calculated according to the cosine theorem for the triangle, that is, the current distance of the viewer's eyes from the display screen.
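  • Step 250 can be sketched as follows; the tangent-based triangle-height formula used here is one way to realise the calculation and is an illustrative assumption, not necessarily the exact derivation of the disclosure:

```python
import math

# Sketch of step 250: with the cameras at A and B separated by baseline `ab`,
# and the base angles from step 240, the height of triangle ABF is the
# current viewing distance Z.
def viewing_distance(angle_a_deg: float, angle_b_deg: float, ab: float) -> float:
    ta = math.tan(math.radians(angle_a_deg))
    tb = math.tan(math.radians(angle_b_deg))
    return ab * ta * tb / (ta + tb)   # height of the triangle on base AB

# Symmetric case: eyes centred between cameras 10 cm apart, 60-degree base
# angles, so Z = 5 * tan(60 deg), about 8.66 cm.
print(round(viewing_distance(60.0, 60.0, 10.0), 2))  # 8.66
```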
  • Step 260 Determine the width of the target layout period corresponding to the current distance based on an inverse proportional relationship between the distance of the viewer's eyes from the display screen and the width of the layout period.
  • FIG. 5b is a geometric relationship diagram of a human eye viewing a display screen provided in Embodiment 2 of the present disclosure.
  • d is a constant indicating the fitting thickness of the lenticular lens;
  • Z is the current distance of the viewer's eyes from the display screen;
  • p0 is a constant indicating the width of a lenticular lens on the display surface;
  • pitch is the width of one layout cycle.
  • Each layout cycle can sequentially arrange 8 viewpoints. Based on the similarity between triangle GAB and triangle GEF, the inverse proportional relationship between pitch and Z can be derived.
  • the width of the target layout period corresponding to the current distance can be determined according to the following formula, derived from the similar triangles: pitch = p0 × (Z + d) / Z, where:
  • pitch is the width of the layout period
  • p0 is a constant, which indicates the width of the horizontal lenticular lens arranged on the display surface
  • d is a constant, which indicates the fitting thickness of the horizontal lenticular lens
  • Z is the current distance of the viewer's eyes from the display screen.
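  • Assuming the similar-triangle relation pitch = p0 × (Z + d) / Z (an assumption consistent with the inverse proportional relationship and the symbols defined above), the computation can be sketched as:

```python
# Sketch under the assumed relation pitch = p0 * (Z + d) / Z, with p0 the
# lens width on the display surface and d the fitting thickness.
def layout_pitch(z: float, p0: float, d: float) -> float:
    return p0 * (z + d) / z

# Moving closer widens the layout period; moving away shrinks it toward p0:
print(layout_pitch(600.0, 1.0, 3.0) > layout_pitch(1000.0, 1.0, 3.0))  # True
```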
  • FIG. 5c is a schematic diagram of calibrating the parameters of the inverse proportional function provided in Embodiment 2 of the present disclosure
  • FIG. 5d is another schematic diagram of calibrating the parameters of the inverse proportional function provided by Embodiment 2 of the present disclosure.
  • the pitch0 value at point A and the pitch1 value at point B can be obtained by using an external camera.
  • the distances between points A and B and the display screen can be set according to actual needs. For example, the distance between point A and the display screen can be set to 60 cm, and the distance between point B and the display screen to 100 cm.
  • the method for obtaining pitch0 or pitch1 may be: starting from the reference layout period width set when the display screen leaves the factory, continuously adjust the displayed image until the image on the display screen reaches the target image, determine the layout period width pitch corresponding to the target image, and take the measured pitch as the target pitch for the external camera at the current position.
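  • Under the same assumed relation pitch = p0 × (z + d) / z, i.e. pitch = p0 + (p0 × d) / z, the two-point calibration described above can be sketched as solving for p0 and d from the two measured (distance, pitch) pairs; this is an illustrative sketch, not the patent's procedure:

```python
# Illustrative two-point calibration: given measured pitch0 at distance z0
# and pitch1 at z1, the equations pitch = p0 + (p0 * d) / z are linear in
# p0 and v = p0 * d, so the constants can be solved directly.
def calibrate(z0: float, pitch0: float, z1: float, pitch1: float):
    v = (pitch0 - pitch1) / (1.0 / z0 - 1.0 / z1)  # v = p0 * d
    p0 = pitch0 - v / z0
    d = v / p0
    return p0, d

# Round-trip check with the example distances (60 cm and 100 cm, in mm):
p0, d = calibrate(600.0, 1.005, 1000.0, 1.003)
print(round(p0, 6), round(d, 4))  # 1.0 3.0
```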
  • Step 270 Adjust and display the layout content according to the width of the target layout period.
  • this embodiment constructs a triangular relationship between the cameras and the position of the human eyes in space to determine the first angle at which the viewer's eyes deviate from the horizontal direction where the first camera is located and the second angle at which they deviate from the horizontal direction where the second camera is located, and then calculates the current distance of the viewer's eyes from the display screen according to the first angle, the second angle, and the distance between the first camera and the second camera.
  • FIG. 6 is a structural block diagram of a naked-eye 3D display device based on human eye tracking according to Embodiment 3 of the present disclosure.
  • the device includes a current distance determining module 310, a layout period width determining module 320, and a layout content adjustment module 330, wherein:
  • the current distance determining module 310 is configured to determine a current distance between the viewer's eyes and the display according to the geometric relationship between the first camera and the second camera and the viewer's eyes provided on the display;
  • the layout period width determining module 320 is configured to determine the width of the target layout period corresponding to the current distance based on a mapping relationship between a preset distance between the viewer's eyes and the display screen and the width of the layout period;
  • the layout content adjustment module 330 is configured to adjust and display the layout content according to the width of the target layout period.
  • the width of the target layout period can be adjusted according to the actual distance. And the actual distance is obtained by using the relative positions of the two cameras on the display screen and the viewer's eyes.
  • the technical solution of this embodiment does not need to use the imaging distance of the eyes in the image to simulate the actual distance between the eyes and the display screen Therefore, this solution is suitable for users with any interpupillary distance, and there is no limitation of individual differences.
  • the technical solution of this embodiment focuses on the current distance of the viewer's eyes from the display screen, so as to facilitate the tracking of the human eyes in the subsequent front-back direction. Therefore, the transformation of the spatial coordinate system is not required to determine the three-dimensional coordinates of the human eye in space, which greatly simplifies the calculation method of the layout cycle width, and the calculated layout cycle width is more accurate. By using the calculated layout cycle width to adjust layout content in real time, viewers can experience better naked eye 3D effects.
  • the current distance determining module 310 includes:
  • An angle calculation unit configured to calculate a first angle at which a viewer's eyes deviate from a horizontal direction where the first camera is located, and a second angle at which the viewer's eyes deviate from a horizontal direction where the second camera is located;
  • the current distance calculation unit is configured to calculate the current distance of the viewer's eyes from the display screen according to the first angle, the second angle, and the distance between the first camera and the second camera.
  • the angle calculation unit is configured to:
  • determine, according to the imaging coordinates of the viewer's eyes in the first camera and in the second camera respectively, a first horizontal distance between the viewer's eyes and the optical axis of the first camera and a second horizontal distance between the viewer's eyes and the optical axis of the second camera;
  • calculate, according to the third angle at which the viewer's eyes deviate from the optical axis of the first camera, a first angle at which the viewer's eyes deviate from the horizontal direction where the first camera is located, and, according to the fourth angle at which the viewer's eyes deviate from the optical axis of the second camera, a second angle at which the viewer's eyes deviate from the horizontal direction where the second camera is located.
  • the mapping relationship is an inverse proportional relationship.
  • the layout period width determining module 320 is configured to determine the width of the target layout period according to the formula pitch = p0 × (z + d) / z, where:
  • pitch is the width of the layout period
  • p0 is a constant, which indicates the width of the horizontal lenticular lens arranged on the display surface
  • d is a constant, which indicates the fitting thickness of the horizontal lenticular lens
  • z is the current distance of the viewer's eyes from the display screen.
  • the naked eye 3D display device based on human eye tracking provided by the embodiment of the present disclosure can execute the naked eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure, and has corresponding function modules and effects of the execution method.
  • FIG. 7 is a schematic structural diagram of a device provided in Embodiment 4 of the present disclosure.
  • FIG. 7 shows a block diagram of an exemplary device 12 suitable for use in implementing embodiments of the present disclosure.
  • the device 12 shown in FIG. 7 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
  • the device 12 is represented in the form of a general-purpose computing device.
  • the components of the device 12 may include, but are not limited to, one or more processors or processing units 16, a system memory 28, and a bus 18 connecting different system components (including the system memory 28 and the processing unit 16).
  • the bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures.
  • these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
  • the device 12 typically includes a variety of computer system-readable media. These media can be any available media that can be accessed by the device 12, including volatile and non-volatile media, removable and non-removable media.
  • System memory 28 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 30 and / or cache memory 32.
  • Device 12 may include other removable / non-removable, volatile / nonvolatile computer system storage media.
  • the storage system 34 may be used to read and write a non-removable, non-volatile magnetic medium (not shown in FIG. 7, commonly referred to as a "hard drive").
  • each drive may be connected to the bus 18 through one or more data medium interfaces.
  • the memory 28 may include at least one program product having a set (eg, at least one) of program modules configured to perform the functions of one or more embodiments of the present disclosure.
  • a program / utility tool 40 having a set (at least one) of program modules 42 may be stored in, for example, the memory 28.
  • such program modules 42 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
  • the program module 42 generally performs functions and / or methods in the embodiments described in the present disclosure.
  • the device 12 may also communicate with one or more external devices 14 (such as a keyboard, pointing device, or display 24), with one or more devices that enable a user to interact with the device 12, and/or with any device (e.g., a network card or modem) that enables the device 12 to communicate with one or more other computing devices. Such communication can be performed through an input/output (I/O) interface 22.
  • the device 12 may also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN), and / or a public network, such as the Internet) through the network adapter 20. As shown, the network adapter 20 communicates with other modules of the device 12 via the bus 18.
  • the processing unit 16 executes one or more functional applications and data processing by running a program stored in the system memory 28, for example, implementing a naked eye 3D display method based on human eye tracking provided by the embodiment of the present disclosure.
  • the method includes: determining a current distance between a viewer's eyes and a display screen according to a geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes; determining, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of a layout period, the width of a target layout period corresponding to the current distance; and adjusting and displaying the layout content according to the width of the target layout period.
  • the processor may also implement the technical solution of the naked eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure.
  • Embodiment 5 of the present disclosure also provides a computer-readable storage medium on which a computer program is stored.
  • the program is executed by a processor, the naked eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure is implemented.
  • the method includes: determining a current distance between a viewer's eyes and a display screen according to a geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes; determining, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of a layout period, the width of a target layout period corresponding to the current distance; and adjusting and displaying the layout content according to the width of the target layout period.
  • the computer storage medium of the embodiment of the present disclosure may adopt any combination of one or more computer-readable media.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof.
  • Computer-readable storage media include: an electrical connection with one or more wires, a portable computer disk, a hard disk, Random Access Memory (RAM), Read-Only Memory (ROM), Erasable Programmable Read-Only Memory (EPROM) or flash memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in combination with an instruction execution system, apparatus, or device.
  • the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, which carries computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and may send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • the program code contained on the computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the foregoing.
  • Computer program code for performing the operations of the present disclosure may be written in one or more programming languages or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar.
  • the program code can be executed entirely on the user's computer, partly on the user's computer, as an independent software package, partly on the user's computer, partly on a remote computer, or entirely on a remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

Disclosed herein is a naked-eye three-dimensional (3D) display method based on human eye tracking, including: determining a current distance between a viewer's eyes and a display screen according to a geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes; determining, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of a layout period, the width of a target layout period corresponding to the current distance; and adjusting and displaying layout content according to the width of the target layout period. Also disclosed herein are a naked-eye 3D display apparatus, a device, and a storage medium based on human eye tracking.

Description

Naked-eye 3D display method, apparatus, device and medium based on human eye tracking
This application claims priority to Chinese patent application No. 201810813867.7, filed with the Chinese Patent Office on July 23, 2018, the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present disclosure relate to the field of three-dimensional (3D) display technology, and for example to a naked-eye 3D display method, apparatus, device, and medium based on human eye tracking.
Background
Naked-eye 3D display technology is a 3D display technology that lets a viewer watch three-dimensional images and perceive a 3D effect directly with the naked eye, without wearing dedicated 3D glasses. Naked-eye 3D display technologies include lenticular gratings, slit gratings, liquid crystal lenses, and the like; lenticular-grating 3D display is currently the most widely used. Lenticular-grating 3D display works by attaching a layer of specially made cylindrical lenses to the front of a conventional display screen. The pixels of the image under each cylindrical lens are divided into several sub-pixels, so that the lens can project each sub-pixel in a different direction. When a user watches 3D display content, the left eye and the right eye respectively receive light emitted by different sub-pixels, so that the two eyes see different pictures, which are fused into a 3D picture in the brain.
However, since each cylindrical lens projects pixel content separately to the left and right eyes, without human eye tracking the light distribution is fixed, and the user has to find a suitable viewing position to see an ideal stereoscopic effect. When the viewing position is unsuitable, light intended for the left eye may enter the right eye; the right eye then sees both the left and the right image, which easily produces crosstalk and degrades the user experience. Therefore, the position of the human eyes needs to be tracked in real time, and the display content adjusted according to the collected eye position.
Human eye tracking includes forward-backward tracking and up-down-left-right tracking. At present, when forward-backward tracking is implemented, a camera arranged on the screen captures an image containing the viewer's eyes, and the width of the layout period is calculated from the relationship between the imaging distance of the two eyes in the captured image and the actual layout period width. The calibration of this relationship presupposes a fixed interpupillary distance of the viewer, generally 6.5 centimeters. In reality, however, not everyone's interpupillary distance is exactly 6.5 centimeters; individual differences exist, so the layout period width calculated in the related art from the imaging distance of the two eyes in the image is not accurate.
Summary
Embodiments of the present disclosure provide a naked-eye 3D display method, apparatus, device, and medium based on human eye tracking, so as to track the forward and backward movement of a user and precisely adjust the layout period of the display content.
In an embodiment, an embodiment of the present disclosure provides a naked-eye 3D display method based on human eye tracking, the method including:
determining a current distance between a viewer's eyes and a display screen according to a geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes;
determining, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of a layout period, the width of a target layout period corresponding to the current distance; and
adjusting and displaying layout content according to the width of the target layout period.
In an embodiment, an embodiment of the present disclosure further provides a naked-eye 3D display apparatus based on human eye tracking, the apparatus including:
a current distance determination module configured to determine a current distance between a viewer's eyes and a display screen according to a geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes;
a layout period width determination module configured to determine, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of a layout period, the width of a target layout period corresponding to the current distance; and
a layout content adjustment module configured to adjust and display layout content according to the width of the target layout period.
In an embodiment, an embodiment of the present disclosure further provides a device, including:
one or more processors; and
a storage apparatus configured to store one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the naked-eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure.
In an embodiment, an embodiment of the present disclosure further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the naked-eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure.
Embodiments of the present disclosure adjust the width of the target layout period according to the actual distance between the viewer's eyes and the display screen. Moreover, since the technical solution of the embodiments of the present disclosure is concerned only with the actual distance from the human eyes to the display screen, there is no need to calculate the three-dimensional coordinates of the eyes in space through coordinate-system transformation as in the related art; the actual distance can be calculated directly from the geometric relationship between the two cameras and the eyes, which greatly simplifies the calculation of that distance. In addition, compared with the related-art approach of calculating the layout period width from the imaging distance of the two eyes in an image, the approach of determining the layout period width directly from the actual eye-to-screen distance provided by the embodiments of the present disclosure does not need to use the imaging distance of the eyes in the image to approximate the actual eye-to-screen distance. The technical solution of the embodiments of the present disclosure is therefore applicable to users with any interpupillary distance, is not limited by individual differences, and improves the calculation accuracy of the layout period width. By using the calculated layout period width to adjust the layout content in real time, the user can experience a better naked-eye 3D effect.
Brief Description of the Drawings
FIG. 1a is a schematic diagram of a layout period in the present disclosure;
FIG. 1b is a schematic diagram of the relationship between forward-backward eye movement and layout period width in the present disclosure;
FIG. 2 is a flowchart of a naked-eye 3D display method based on human eye tracking provided in Embodiment 1 of the present disclosure;
FIG. 3a is a schematic diagram of the geometric relationship among a first camera, a second camera, and a viewer's eyes provided in Embodiment 1 of the present disclosure;
FIG. 3b is a schematic diagram of a layout period adjustment manner provided in Embodiment 1 of the present disclosure;
FIG. 4 is a flowchart of a naked-eye 3D display method based on human eye tracking provided in Embodiment 2 of the present disclosure;
FIG. 5a is a schematic diagram of the positional relationship between a camera and a viewer's eyes provided in Embodiment 2 of the present disclosure;
FIG. 5b is a geometric diagram of human eyes viewing a display screen provided in Embodiment 2 of the present disclosure;
FIG. 5c is a schematic diagram of calibrating the parameters of an inverse-proportion function provided in Embodiment 2 of the present disclosure;
FIG. 5d is another schematic diagram of calibrating the parameters of an inverse-proportion function provided in Embodiment 2 of the present disclosure;
FIG. 6 is a structural block diagram of a naked-eye 3D display apparatus based on human eye tracking provided in Embodiment 3 of the present disclosure; and
FIG. 7 is a schematic structural diagram of a device provided in Embodiment 4 of the present disclosure.
Detailed Description
The present disclosure is described below with reference to the drawings and embodiments. The embodiments described herein are merely intended to explain the present disclosure, not to limit it. For ease of description, the drawings show only the parts related to the present disclosure rather than the complete structure.
In implementing naked-eye 3D display, in order to let the user experience an ideal naked-eye 3D display effect, the technical solution of the embodiments of the present disclosure adopts human eye tracking; that is, the layout parameters of the display content are adjusted following the position of the viewer's eyes. The layout parameters include the layout period width (pitch), and each layout period width can be quantized to the range 0 to 1. The actual physical width of each layout period may take the width of the grating film as a reference; in an embodiment, it may be expressed by the number of row pixels covered by the grating film. FIG. 1a is a schematic diagram of a layout period in the present disclosure. As shown in FIG. 1a, the contents of multiple images are arranged in order within the 0-1 interval as one layout period. Each layout period contains at least two images, and any two of the at least two images enter the viewer's left eye and right eye respectively, so as to form the naked-eye 3D effect.
In an embodiment, 0 to 1 represents one period, analogous to one sine period of 2π, and at least two images can be arranged within the period. For example, the 0-1 period may contain 28 images, which divide the phase of the period proportionally; that is, 0 represents a phase of 0 and 1 represents a phase of 2π. In this case, the unit of the 0-1 interval can be understood as phase in radians.
In addition, the human eye tracking involved in the technical solution of the embodiments of the present disclosure is tracking of the eye position as the viewer moves toward or away from the display screen. By establishing a mapping relationship between the eye-to-screen distance and the layout period width, the layout period width can be obtained from the actual eye-to-screen distance acquired in real time. FIG. 1b is a schematic diagram of the relationship between forward-backward eye movement and layout period width in the present disclosure. As shown in FIG. 1b, ideally, when the eyes are at point A, to ensure that the content at point O is seen through adjacent cylindrical lenses, the layout period width should be the length of segment EF; when the eyes are at point B, to see the content displayed at point O through two adjacent cylindrical lenses, the layout period width should be the length of segment DF. Clearly, the layout period width is related to the viewing distance: when the midpoint of the viewer's eyes moves from point B to point A, the layout period width should increase from DF to EF; when it moves from point A to point B, the width should decrease from EF to DF. Accordingly, the embodiments of the present disclosure construct an inverse-proportion relationship between the viewer's eye distance and the layout period width: by acquiring the current eye-to-screen distance in real time, the layout period width can be obtained, and the display content can then be adjusted according to the obtained width, improving the naked-eye 3D display effect.
Embodiment 1
FIG. 2 is a flowchart of a naked-eye 3D display method based on human eye tracking provided in Embodiment 1 of the present disclosure. The method may be executed by a naked-eye 3D display apparatus based on human eye tracking, which may be implemented in software and/or hardware and may be integrated in a control device that plays the display content. Referring to FIG. 2, the method of this embodiment includes:
Step 110: determine a current distance between the viewer's eyes and the display screen according to the geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes.
The first camera and the second camera may be a camera module built into the display screen, or external cameras fixedly mounted on the display screen, communicatively connected to the control device, and facing the viewing area of the display screen.
The display screen in this embodiment may be the display screen of a small display device such as a mobile phone, tablet computer, or laptop computer, or of a large display device such as a large-screen television or advertising machine. To form the naked-eye 3D display effect, a grating film is arranged on the display screen in this embodiment. The grating film may be a lenticular grating film or a slit grating film, and may be arranged perpendicular to the bottom edge of the display screen or at a certain inclination angle to it.
In an embodiment, the cameras in the embodiments of the present disclosure are both perpendicular to the display screen, so that the optical axes of the two cameras are parallel. If there is an included angle between the optical axes of the two cameras, the angle between the cameras must first be calibrated to ensure that the cameras are perpendicular to the display screen.
The viewer eye position in this embodiment may be the center position of the viewer's two eyes, or the viewer's left or right eye. The geometric relationship among the first camera, the second camera, and the viewer's eyes in this embodiment may be a triangle constructed in space, and the current distance from the viewer's eyes to the display screen is the height of that triangle. In an embodiment, the height of the triangle can be calculated in several ways, for example from the lengths of two sides and their included angle.
In an embodiment, this embodiment calculates the height of the triangle from two angles and the length of one side, according to the law of cosines. FIG. 3a is a schematic diagram of the geometric relationship among the first camera, the second camera, and the viewer's eyes provided in Embodiment 1 of the present disclosure. As shown in FIG. 3a, F is the center position of the viewer's eyes. The distance between the first camera 10 and the second camera 20 is a known constant; by calculating the first angle ∠A by which the viewer's eyes deviate from the horizontal direction of the first camera and the second angle ∠B by which they deviate from the horizontal direction of the second camera, the height AC of the triangle — that is, the current distance from the viewer's eyes to the display screen — can be calculated according to the law of cosines.
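The triangle construction above admits a simple closed-form height. As a minimal sketch (the function and variable names are hypothetical, not from the patent), assuming the two base angles ∠A, ∠B and the camera baseline are known:

```python
import math

def eye_screen_distance(angle_a_deg, angle_b_deg, baseline):
    """Height of the triangle formed by the two cameras and the eye.

    angle_a_deg / angle_b_deg: angles at the two cameras between the
    camera baseline and the line of sight to the eye (degrees).
    baseline: distance between the two cameras (same unit as the result).
    """
    a = math.radians(angle_a_deg)
    b = math.radians(angle_b_deg)
    # The foot of the height splits the baseline into h/tan(A) + h/tan(B),
    # so h = baseline / (cot A + cot B).
    return baseline / (1.0 / math.tan(a) + 1.0 / math.tan(b))

# Symmetric case: eye centered, both angles 45 degrees, baseline 10 cm -> h = 5 cm.
print(round(eye_screen_distance(45, 45, 10.0), 6))
```

The same result follows from the law of sines/cosines the patent invokes; the cotangent form is just the most direct way to express the triangle's height.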
Step 120: determine, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the layout period width, the width of the target layout period corresponding to the current distance.
Exemplarily, in this embodiment the mapping relationship is an inverse-proportion relationship: the larger the current distance from the viewer's eyes to the display screen, the smaller the width of the target layout period; the smaller the current distance, the larger the width. Based on this inverse-proportion relationship, the corresponding target layout period width can be determined directly from the current distance, and the layout content can then be adjusted according to that width.
Step 130: adjust and display the layout content according to the width of the target layout period.
When the layout content is adjusted according to the target layout period width, the position of the layout reference line may be taken as the starting position; that is, the viewpoint positions on the layout reference line do not change while the other viewpoint positions shift. For example, by shifting viewpoint positions, the number of sub-pixels corresponding to each viewpoint can be increased, so that the number of sub-pixels corresponding to each layout period in the horizontal direction of the screen increases; that is, the layout period width increases.
Taking either camera in this embodiment as the reference camera, the pixel coordinates of the reference camera in the coordinate system of the display screen can serve as the layout base point for software layout, and the line segment through the base point parallel to the arrangement direction of the grating film serves as the layout reference line.
Exemplarily, FIG. 3b is a schematic diagram of a layout period adjustment manner provided in Embodiment 1 of the present disclosure, showing an adjustment that increases the layout period width. As shown in FIG. 3b, when the layout period width is adjusted, the starting viewpoint positions of the layout content corresponding to all points on the layout reference line AB do not change, while the starting viewpoint positions of all other points shift accordingly. With this arrangement, the image displayed on the screen is adjusted as the viewer moves, which avoids crosstalk and improves the viewing experience.
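The reference-line behavior described above can be illustrated with a toy model (the names and the uniform-column assumption are mine, not the patent's): if each sub-pixel column's layout phase is measured from the reference line, the reference line keeps its viewpoint when the pitch changes, while every other column shifts.

```python
def layout_phase(x, x_ref, pitch):
    """Phase in [0, 1) of sub-pixel column x for a given layout pitch.

    x_ref: column of the layout reference line; its phase is always 0,
    so its viewpoint never changes when the pitch is adjusted.
    """
    return ((x - x_ref) / pitch) % 1.0

# The reference column keeps phase 0 for any pitch ...
print(layout_phase(100, 100, 7.5), layout_phase(100, 100, 8.0))
# ... while other columns take different phases as the pitch changes.
print(round(layout_phase(131, 100, 7.5), 4), round(layout_phase(131, 100, 8.0), 4))
```

The phase can then be mapped to a viewpoint index (e.g., floor(phase * number_of_views)) when filling the interleaved image.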
In the technical solution of this embodiment, by determining the current actual distance from the viewer's eyes to the screen, the width of the target layout period can be adjusted according to that actual distance, and the actual distance is obtained from the relative positions of the two cameras on the display screen and the viewer's eyes. Compared with the related-art solution that calculates the layout period width from the imaging distance of the two eyes in an image, the technical solution of this embodiment does not need to use that imaging distance to approximate the actual eye-to-screen distance; it is therefore applicable to users with any interpupillary distance and is not limited by individual differences. In addition, the technical solution of this embodiment focuses on the current distance from the viewer's eyes to the display screen, which facilitates subsequent forward-backward eye tracking: no coordinate-system transformation is needed to determine the three-dimensional coordinates of the eyes in space, which greatly simplifies the calculation of the layout period width and makes the calculated width more accurate. By using the calculated layout period width to adjust the layout content in real time, the viewer can experience a better naked-eye 3D effect.
Embodiment 2
FIG. 4 is a flowchart of a naked-eye 3D display method based on human eye tracking provided in Embodiment 2 of the present disclosure. This embodiment is optimized on the basis of the above embodiment; explanations of terms identical or corresponding to those above are not repeated here. Referring to FIG. 4, the method provided in this embodiment includes:
Step 210: determine, according to the imaging coordinates of the viewer's eyes in the first camera and in the second camera respectively, a first horizontal distance from the viewer's eyes to the optical axis of the first camera and a second horizontal distance from the viewer's eyes to the optical axis of the second camera.
Since the first camera and the second camera are at different positions, the position of the viewer's eyes differs between the images captured by the two cameras. In general, the optical axis of a camera passes through the midpoint of the image it captures; therefore, from the imaging coordinates of the viewer's eyes in the first and second cameras, the first horizontal distance from the eyes to the optical axis of the first camera and the second horizontal distance to the optical axis of the second camera can be determined.
Exemplarily, FIG. 5a is a schematic diagram of the positional relationship between a camera and the viewer's eyes provided in Embodiment 2 of the present disclosure. The camera shown in FIG. 5a may be either the first camera or the second camera. As shown in FIG. 5a, points C and D are the left and right boundary points of the image, F is the imaging coordinate of the viewer's eyes in the image, and E is the midpoint of the captured image. By analyzing the captured image, the first horizontal distance from F to E can be obtained; in the same way, for the other camera on the screen, the second horizontal distance from F to E can be obtained.
Step 220: calculate, according to the first horizontal distance, the field of view of the first camera, and the resolution of the captured image, a third angle by which the viewer's eyes deviate from the optical axis of the first camera.
Exemplarily, as shown in FIG. 5a, the field of view of the camera is ∠CAD; in this embodiment the field of view is 60°. The resolution of the image captured by the camera can be preset according to actual requirements, for example 640×480.
In an embodiment, as shown in FIG. 5a, once the resolution of the captured image is determined, the length of CE is half the number of pixels along the length of the image. For example, if the image is 640 pixels long, CE is 320 pixels. Likewise, once the camera's field of view is determined, ∠CAE is half the field of view; for example, if the field of view is 60°, ∠CAE is 30°.
From the length of CE and the size of ∠CAE, the length of AE can be calculated. With AE determined, the tangent of ∠EAF follows from the distance EF: tan(∠EAF) = EF/AE. The size of ∠EAF, i.e., the third angle, is then arctan(EF/AE).
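The pixel-offset-to-angle step above can be sketched as follows (the function name and the pixel-unit distance AE are my own framing of the FIG. 5a geometry, not terms from the patent):

```python
import math

def off_axis_angle_deg(offset_px, image_width_px, fov_deg):
    """Angle between the camera optical axis and the eye direction.

    offset_px: horizontal pixel distance EF from the image center to the eye.
    image_width_px: pixels along the image width (so CE = width / 2).
    fov_deg: horizontal field of view (angle CAD = 2 * angle CAE).
    """
    ce = image_width_px / 2.0
    # AE, in pixel units, follows from tan(CAE) = CE / AE.
    ae = ce / math.tan(math.radians(fov_deg / 2.0))
    # The third angle EAF satisfies tan(EAF) = EF / AE.
    return math.degrees(math.atan(offset_px / ae))

# Eye imaged at the border (offset = half the width) -> angle = half the FOV.
print(round(off_axis_angle_deg(320, 640, 60.0), 6))
```

The first angle ∠A of step 240 then follows as 90° minus this value.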
Step 230: calculate, according to the second horizontal distance, the field of view of the second camera, and the resolution of the captured image, a fourth angle by which the viewer's eyes deviate from the optical axis of the second camera.
Exemplarily, the fourth angle is calculated in the same way as the third angle in step 220; refer to the calculation principle of the third angle, which is not repeated here.
In an embodiment, there is no fixed execution order between step 220 and step 230; they may be executed sequentially or simultaneously, which is not limited in this embodiment.
Step 240: calculate, according to the third angle, the first angle by which the viewer's eyes deviate from the horizontal direction of the first camera, and calculate, according to the fourth angle, the second angle by which the viewer's eyes deviate from the horizontal direction of the second camera.
As shown in FIG. 5a, once the third angle is obtained, the first angle by which the viewer's eyes deviate from the horizontal direction of the first camera is (90° − ∠EAF); as shown in FIG. 3a and FIG. 5a, the first angle is ∠A. Similarly, the second angle is (90° − the fourth angle), i.e., ∠B in FIG. 3a.
Step 250: calculate the current distance from the viewer's eyes to the display screen according to the first angle, the second angle, and the spacing between the first camera and the second camera.
Exemplarily, as shown in FIG. 3a, once ∠A, ∠B, and the distance AB are determined, the distance FC — that is, the current distance from the viewer's eyes to the display screen — can be calculated according to the law of cosines for the triangle.
Step 260: determine, based on the preset inverse-proportion relationship between the distance from the viewer's eyes to the display screen and the layout period width, the width of the target layout period corresponding to the current distance.
The derivation of the inverse-proportion relationship in this embodiment can refer to the geometric diagram shown in FIG. 5b, which is a geometric diagram of human eyes viewing a display screen provided in Embodiment 2 of the present disclosure. As shown in FIG. 5b, d is a constant representing the bonding thickness of the horizontal lenticular lenses, Z is the current distance from the viewer's eyes to the display screen, p0 is a constant representing the width of the horizontal lenticular lenses arranged on the surface of the display screen, and pitch is the width of one layout period; each layout period can arrange eight viewpoints in sequence. From the similarity of triangle GAB and triangle GEF, the inverse-proportion relationship between pitch and Z can be derived.
In an embodiment, the width of the target layout period corresponding to the current distance can be determined according to the following formula:
pitch = p0 + p0 × (d ÷ Z);
where pitch is the width of the layout period; p0 is a constant representing the width of the horizontal lenticular lenses arranged on the surface of the display screen; d is a constant representing the bonding thickness of the horizontal lenticular lenses; and Z is the current distance from the viewer's eyes to the display screen.
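The formula transcribes directly into code; the numeric values below are made-up examples, since the patent leaves p0 and d to calibration:

```python
def layout_pitch(z, p0, d):
    """pitch = p0 + p0 * (d / z): layout period width at eye distance z.

    p0 (lens width) and d (bonding thickness) are the two constants the
    patent obtains by calibration; the sample values below are hypothetical.
    """
    return p0 + p0 * (d / z)

# The pitch shrinks toward p0 as the viewer moves away from the screen.
print(round(layout_pitch(600.0, 8.0, 30.0), 4))   # closer viewer
print(round(layout_pitch(1000.0, 8.0, 30.0), 4))  # farther viewer
```

This matches the qualitative behavior stated in step 120: larger distance, smaller pitch.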
In an embodiment, since the parameters p0 and d cannot be measured directly, Z and pitch must be measured at two points in space in order to determine p0 and d.
In an embodiment, FIG. 5c is a schematic diagram of calibrating the parameters of the inverse-proportion function provided in Embodiment 2 of the present disclosure, and FIG. 5d is another such schematic diagram. As shown in FIG. 5c, an external camera can be used to obtain the value pitch0 at point A and the value pitch1 at point B. The distances of A and B from the display screen can be set according to actual requirements; for example, point B may be set 60 centimeters from the display screen and point A 100 centimeters from it.
In an embodiment, pitch0 or pitch1 may be obtained as follows: starting from the reference layout period width of the display screen at the factory, the displayed image is adjusted continuously until the screen displays the target image; the layout period width pitch corresponding to that target image is then determined, and the measured pitch is set as the target pitch of the external camera at its current position.
As shown in FIG. 5d, a whiteboard with a single black dot is placed at positions A and B in turn, and the dual-camera eye-tracking procedure of this embodiment then calculates the distances of the black-dot coordinates B and A from the display screen, yielding the corresponding Z0 and Z1. Finally, substituting the obtained pitch0, Z0, pitch1, and Z1 into the formula y = a + b/x gives the constants a and b, which correspond to p0 and d in the above inverse-proportion function. Once the inverse-proportion function is determined, forward-backward tracking of the eye position can be implemented according to it.
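The two-point calibration reduces to solving two linear equations for a and b in y = a + b/x. A sketch (the function name and sample numbers are hypothetical; note that with pitch = p0 + p0·d/z this fit gives a = p0 and b = p0·d, so the thickness follows as d = b/a):

```python
def calibrate(z0, pitch0, z1, pitch1):
    """Fit y = a + b/x through two measured (z, pitch) samples.

    With pitch = p0 + p0*d/z, the fit yields a = p0 and b = p0*d.
    """
    # Two linear equations in (a, b): pitch_i = a + b * (1/z_i).
    inv0, inv1 = 1.0 / z0, 1.0 / z1
    b = (pitch0 - pitch1) / (inv0 - inv1)
    a = pitch0 - b * inv0
    return a, b

# Made-up measurements at 60 cm and 100 cm viewing distance (in mm-scale units).
a, b = calibrate(600.0, 8.4, 1000.0, 8.24)
print(round(a, 4), round(b, 4), round(b / a, 4))  # a = p0, b = p0*d, b/a = d
```

Any two distinct distances work; well-separated calibration points keep the fit numerically stable.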
Step 270: adjust and display the layout content according to the width of the target layout period.
On the basis of the above embodiment, this embodiment determines the first angle by which the viewer's eyes deviate from the horizontal direction of the first camera and the second angle by which they deviate from the horizontal direction of the second camera by constructing the triangular relationship in space between the cameras and the eye position, and calculates the current distance from the viewer's eyes to the display screen from the first angle, the second angle, and the spacing between the first camera and the second camera. Compared with coordinate-system transformation, the technical solution of this embodiment simplifies the calculation of the current distance; the operation is simple and practical, the calculated layout period width is more accurate, and great convenience is provided for forward-backward tracking of the viewer.
Embodiment 3
FIG. 6 is a structural block diagram of a naked-eye 3D display apparatus based on human eye tracking provided in Embodiment 3 of the present disclosure. The apparatus includes a current distance determination module 310, a layout period width determination module 320, and a layout content adjustment module 330, wherein:
the current distance determination module 310 is configured to determine a current distance between the viewer's eyes and the display screen according to a geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes;
the layout period width determination module 320 is configured to determine, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of a layout period, the width of a target layout period corresponding to the current distance; and
the layout content adjustment module 330 is configured to adjust and display layout content according to the width of the target layout period.
In the technical solution of this embodiment, by determining the current actual distance from the viewer's eyes to the screen, the width of the target layout period can be adjusted according to that actual distance, and the actual distance is obtained from the relative positions of the two cameras on the display screen and the viewer's eyes. Compared with the related-art solution that calculates the layout period width from the imaging distance of the two eyes in an image, the technical solution of this embodiment does not need to use that imaging distance to approximate the actual eye-to-screen distance; it is therefore applicable to users with any interpupillary distance and is not limited by individual differences. In addition, the technical solution of this embodiment focuses on the current distance from the viewer's eyes to the display screen, which facilitates subsequent forward-backward eye tracking: no coordinate-system transformation is needed to determine the three-dimensional coordinates of the eyes in space, which greatly simplifies the calculation of the layout period width and makes the calculated width more accurate. By using the calculated layout period width to adjust the layout content in real time, the viewer can experience a better naked-eye 3D effect.
On the basis of the above embodiment, the current distance determination module 310 includes:
an angle calculation unit configured to calculate a first angle by which the viewer's eyes deviate from the horizontal direction of the first camera and a second angle by which they deviate from the horizontal direction of the second camera; and
a current distance calculation unit configured to calculate the current distance from the viewer's eyes to the display screen according to the first angle, the second angle, and the spacing between the first camera and the second camera.
On the basis of the above embodiment, the angle calculation unit is configured to:
determine, according to the imaging coordinates of the viewer's eyes in the first camera and in the second camera respectively, a first horizontal distance from the viewer's eyes to the optical axis of the first camera and a second horizontal distance from the viewer's eyes to the optical axis of the second camera;
calculate, according to the first horizontal distance, the field of view of the first camera, and the resolution of the captured image, a third angle by which the viewer's eyes deviate from the optical axis of the first camera;
calculate, according to the second horizontal distance, the field of view of the second camera, and the resolution of the captured image, a fourth angle by which the viewer's eyes deviate from the optical axis of the second camera; and
calculate, according to the third angle, the first angle by which the viewer's eyes deviate from the horizontal direction of the first camera, and calculate, according to the fourth angle, the second angle by which the viewer's eyes deviate from the horizontal direction of the second camera.
On the basis of the above embodiment, the mapping relationship is an inverse-proportion relationship.
On the basis of the above embodiment, the layout period width determination module 320 is configured to:
determine the width of the target layout period corresponding to the current distance according to the following formula:
pitch = p0 + p0 × (d ÷ z);
where pitch is the width of the layout period; p0 is a constant representing the width of the horizontal lenticular lenses arranged on the surface of the display screen; d is a constant representing the bonding thickness of the horizontal lenticular lenses; and z is the current distance from the viewer's eyes to the display screen.
The naked-eye 3D display apparatus based on human eye tracking provided by the embodiments of the present disclosure can execute the naked-eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure, and has functional modules and effects corresponding to the executed method. For technical details not described exhaustively in the above embodiment, refer to the naked-eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure.
Embodiment 4
FIG. 7 is a schematic structural diagram of a device provided in Embodiment 4 of the present disclosure. FIG. 7 shows a block diagram of an exemplary device 12 suitable for implementing embodiments of the present disclosure. The device 12 shown in FIG. 7 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present disclosure.
As shown in FIG. 7, the device 12 takes the form of a general-purpose computing device. The components of the device 12 may include, but are not limited to, one or more processors or processing units 16, a system memory 28, and a bus 18 connecting the different system components (including the system memory 28 and the processing unit 16).
The bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The device 12 typically includes a variety of computer-system-readable media. These media may be any available media that can be accessed by the device 12, including volatile and non-volatile media, and removable and non-removable media.
The system memory 28 may include computer-system-readable media in the form of volatile memory, such as random access memory (RAM) 30 and/or cache memory 32. The device 12 may include other removable/non-removable, volatile/non-volatile computer-system storage media. By way of example only, the storage system 34 may be used to read and write a non-removable, non-volatile magnetic medium (not shown in FIG. 7, commonly referred to as a "hard drive"). Although not shown in FIG. 7, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk") and an optical disc drive for reading from and writing to a removable non-volatile optical disc (e.g., a Compact Disc Read-Only Memory (CD-ROM), a Digital Video Disc Read-Only Memory (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to the bus 18 through one or more data medium interfaces. The memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to perform the functions of one or more embodiments of the present disclosure.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in the memory 28. Such program modules 42 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment. The program modules 42 generally perform the functions and/or methods of the embodiments described in the present disclosure.
The device 12 may also communicate with one or more external devices 14 (such as a keyboard, pointing device, or display 24), with one or more devices that enable a user to interact with the device 12, and/or with any device (e.g., a network card or modem) that enables the device 12 to communicate with one or more other computing devices. Such communication may be performed through an input/output (I/O) interface 22. In addition, the device 12 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through the network adapter 20. As shown in the figure, the network adapter 20 communicates with the other modules of the device 12 via the bus 18. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the device 12, including but not limited to microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Arrays of Independent Drives (RAID) systems, tape drives, data backup storage systems, and the like.
The processing unit 16 executes programs stored in the system memory 28 to perform one or more functional applications and data processing, for example implementing the naked-eye 3D display method based on human eye tracking provided by the embodiments of the present disclosure. The method includes:
determining a current distance between a viewer's eyes and a display screen according to a geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes;
determining, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of a layout period, the width of a target layout period corresponding to the current distance; and
adjusting and displaying the layout content according to the width of the target layout period.
In an embodiment, the processor may also implement the technical solution of the naked-eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure.
Embodiment 5
Embodiment 5 of the present disclosure further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the naked-eye 3D display method based on human eye tracking provided by any embodiment of the present disclosure. The method includes:
determining a current distance between a viewer's eyes and a display screen according to a geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes;
determining, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of a layout period, the width of a target layout period corresponding to the current distance; and
adjusting and displaying the layout content according to the width of the target layout period.
The computer storage medium of the embodiments of the present disclosure may adopt any combination of one or more computer-readable media. A computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. Computer-readable storage media (a non-exhaustive list) include: an electrical connection with one or more wires, a portable computer disk, a hard disk, Random Access Memory (RAM), Read-Only Memory (ROM), Erasable Programmable Read-Only Memory (EPROM) or flash memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In this document, a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in combination with an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, which carries computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; it may send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
The program code contained on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire, optical fiber cable, radio frequency (RF), and the like, or any suitable combination of the above.
Computer program code for performing the operations of the present disclosure may be written in one or more programming languages or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).

Claims (10)

  1. A naked-eye three-dimensional (3D) display method based on human eye tracking, comprising:
    determining a current distance between a viewer's eyes and a display screen according to a geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes;
    determining, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of a layout period, the width of a target layout period corresponding to the current distance; and
    adjusting and displaying layout content according to the width of the target layout period.
  2. The method according to claim 1, wherein determining the current distance between the viewer's eyes and the display screen according to the geometric relationship among the first camera and the second camera arranged on the display screen and the viewer's eyes comprises:
    calculating a first angle by which the viewer's eyes deviate from a horizontal direction of the first camera and a second angle by which the viewer's eyes deviate from a horizontal direction of the second camera; and
    calculating the current distance between the viewer's eyes and the display screen according to the first angle, the second angle, and a spacing between the first camera and the second camera.
  3. The method according to claim 2, wherein calculating the first angle by which the viewer's eyes deviate from the horizontal direction of the first camera and the second angle by which the viewer's eyes deviate from the horizontal direction of the second camera comprises:
    determining, according to imaging coordinates of the viewer's eyes in the first camera and in the second camera respectively, a first horizontal distance from the viewer's eyes to an optical axis of the first camera and a second horizontal distance from the viewer's eyes to an optical axis of the second camera;
    calculating, according to the first horizontal distance, a field of view of the first camera, and a resolution of a captured image, a third angle by which the viewer's eyes deviate from the optical axis of the first camera;
    calculating, according to the second horizontal distance, a field of view of the second camera, and the resolution of the captured image, a fourth angle by which the viewer's eyes deviate from the optical axis of the second camera; and
    calculating, according to the third angle, the first angle by which the viewer's eyes deviate from the horizontal direction of the first camera, and calculating, according to the fourth angle, the second angle by which the viewer's eyes deviate from the horizontal direction of the second camera.
  4. The method according to claim 1, wherein the mapping relationship is an inverse-proportion relationship.
  5. The method according to any one of claims 1-4, wherein determining, based on the preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of the layout period, the width of the target layout period corresponding to the current distance comprises:
    determining the width of the target layout period corresponding to the current distance according to the following formula:
    pitch = p0 + p0 × (d ÷ z);
    wherein pitch is the width of the layout period; p0 is a constant representing a width of horizontal lenticular lenses arranged on a surface of the display screen; d is a constant representing a bonding thickness of the horizontal lenticular lenses; and z is the current distance between the viewer's eyes and the display screen.
  6. A naked-eye three-dimensional (3D) display apparatus based on human eye tracking, comprising:
    a current distance determination module configured to determine a current distance between a viewer's eyes and a display screen according to a geometric relationship among a first camera and a second camera arranged on the display screen and the viewer's eyes;
    a layout period width determination module configured to determine, based on a preset mapping relationship between the distance from the viewer's eyes to the display screen and the width of a layout period, the width of a target layout period corresponding to the current distance; and
    a layout content adjustment module configured to adjust and display layout content according to the width of the target layout period.
  7. The apparatus according to claim 6, wherein the current distance determination module comprises:
    an angle calculation unit configured to calculate a first angle by which the viewer's eyes deviate from a horizontal direction of the first camera and a second angle by which the viewer's eyes deviate from a horizontal direction of the second camera; and
    a current distance calculation unit configured to calculate the current distance between the viewer's eyes and the display screen according to the first angle, the second angle, and a spacing between the first camera and the second camera.
  8. The apparatus according to claim 7, wherein the angle calculation unit is configured to:
    determine, according to imaging coordinates of the viewer's eyes in the first camera and in the second camera respectively, a first horizontal distance from the viewer's eyes to an optical axis of the first camera and a second horizontal distance from the viewer's eyes to an optical axis of the second camera;
    calculate, according to the first horizontal distance, a field of view of the first camera, and a resolution of a captured image, a third angle by which the viewer's eyes deviate from the optical axis of the first camera;
    calculate, according to the second horizontal distance, a field of view of the second camera, and the resolution of the captured image, a fourth angle by which the viewer's eyes deviate from the optical axis of the second camera; and
    calculate, according to the third angle, the first angle by which the viewer's eyes deviate from the horizontal direction of the first camera, and calculate, according to the fourth angle, the second angle by which the viewer's eyes deviate from the horizontal direction of the second camera.
  9. A device, comprising:
    one or more processors; and
    a storage apparatus configured to store one or more programs,
    wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the naked-eye three-dimensional (3D) display method based on human eye tracking according to any one of claims 1-5.
  10. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the naked-eye three-dimensional (3D) display method based on human eye tracking according to any one of claims 1-5.
PCT/CN2018/111648 2018-07-23 2018-10-24 Naked-eye 3D display method, apparatus, device and medium based on human eye tracking WO2020019548A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810813867.7 2018-07-23
CN201810813867.7A CN108881893A (zh) 2018-07-23 2018-07-23 Naked-eye 3D display method, apparatus, device and medium based on human eye tracking

Publications (1)

Publication Number Publication Date
WO2020019548A1 true WO2020019548A1 (zh) 2020-01-30

Family

ID=64304353

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/111648 WO2020019548A1 (zh) 2018-07-23 2018-10-24 Naked-eye 3D display method, apparatus, device and medium based on human eye tracking

Country Status (2)

Country Link
CN (1) CN108881893A (zh)
WO (1) WO2020019548A1 (zh)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109743564A (zh) * 2019-01-31 2019-05-10 深圳市维尚境界显示技术有限公司 Naked-eye 3D display screen layout method with dynamic adjustment, and electronic device
WO2020252738A1 (zh) * 2019-06-20 2020-12-24 深圳市立体通科技有限公司 Method for manually calibrating 3D parameters of a naked-eye 3D display screen, and electronic device
CN110674715B (zh) * 2019-09-16 2022-02-18 宁波视睿迪光电有限公司 Human eye tracking method and apparatus based on RGB images
CN112929642A (zh) * 2019-12-05 2021-06-08 北京芯海视界三维科技有限公司 Human eye tracking apparatus and method, and 3D display device and method
CN113660480B (zh) * 2021-08-16 2023-10-31 纵深视觉科技(南京)有限责任公司 Method and apparatus for implementing a look-around function, electronic device, and storage medium
CN113867526A (zh) * 2021-09-17 2021-12-31 纵深视觉科技(南京)有限责任公司 Optimized display method, apparatus, device and medium based on human eye tracking
CN114584754B (zh) * 2022-02-28 2023-12-26 广东未来科技有限公司 3D display method and related apparatus

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120236118A1 (en) * 2011-03-17 2012-09-20 Chi Mei Communication Systems, Inc. Electronic device and method for automatically adjusting viewing angle of 3d images
CN105959674A (zh) * 2016-05-20 2016-09-21 京东方科技集团股份有限公司 3D display apparatus and 3D display method
CN107885325A (zh) * 2017-10-23 2018-04-06 上海玮舟微电子科技有限公司 Naked-eye 3D display method and control system based on human eye tracking
CN108174182A (zh) * 2017-12-30 2018-06-15 上海易维视科技股份有限公司 Viewing zone adjustment method and display system for three-dimensional tracking naked-eye stereoscopic display

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103248905A (zh) * 2013-03-22 2013-08-14 深圳市云立方信息科技有限公司 Display apparatus and visual display method for simulating a holographic 3D scene
CN104111059A (zh) * 2014-07-16 2014-10-22 宇龙计算机通信科技(深圳)有限公司 Distance-measuring and positioning apparatus, method and terminal
CN104618705B (zh) * 2014-11-28 2017-04-05 深圳市魔眼科技有限公司 Eyeball-tracking-based distance-adaptive holographic display method and device
CN106713894B (zh) * 2015-11-17 2019-06-18 深圳超多维科技有限公司 Tracking stereoscopic display method and device
CN106813649A (zh) * 2016-12-16 2017-06-09 北京远特科技股份有限公司 Image ranging and positioning method and apparatus, and ADAS
CN106597424A (zh) * 2016-12-22 2017-04-26 惠州Tcl移动通信有限公司 Dual-camera-based distance measurement method and system, and mobile terminal
CN107172417B (zh) * 2017-06-30 2019-12-20 深圳超多维科技有限公司 Image display method, apparatus and system for a naked-eye 3D screen
CN107343193B (zh) * 2017-07-31 2019-08-06 深圳超多维科技有限公司 Naked-eye stereoscopic display method, apparatus and device


Also Published As

Publication number Publication date
CN108881893A (zh) 2018-11-23

Similar Documents

Publication Publication Date Title
WO2020019548A1 (zh) Naked-eye 3D display method, apparatus, device and medium based on human eye tracking
CN107885325B (zh) Naked-eye 3D display method and control system based on human eye tracking
WO2020029373A1 (zh) Method, apparatus, device and storage medium for determining the spatial position of human eyes
CA2888943C (en) Augmented reality system and method for positioning and mapping
US9983546B2 (en) Display apparatus and visual displaying method for simulating a holographic 3D scene
US9191661B2 (en) Virtual image display device
US9848184B2 (en) Stereoscopic display system using light field type data
US20230269358A1 (en) Methods and systems for multiple access to a single hardware data stream
US10650573B2 (en) Synthesizing an image from a virtual perspective using pixels from a physical imager array weighted based on depth error sensitivity
US9813693B1 (en) Accounting for perspective effects in images
KR101903619B1 Structured stereo
US20150009119A1 (en) Built-in design of camera system for imaging and gesture processing applications
CN108989785B (zh) Naked-eye 3D display method, apparatus, terminal and medium based on human eye tracking
US9996960B2 (en) Augmented reality system and method
WO2022228119A1 (zh) Image acquisition method and apparatus, electronic device, and medium
CN107635132B (zh) Display control method and apparatus for a naked-eye 3D display terminal, and display terminal
KR102176963B1 System and method for capturing horizontal-parallax stereo panoramas
WO2018161564A1 (zh) Gesture recognition system and method, and display device
US10296098B2 (en) Input/output device, input/output program, and input/output method
US20190102945A1 (en) Imaging device and imaging method for augmented reality apparatus
KR20110079969A Display apparatus and control method thereof
CN111857461B (zh) Image display method and apparatus, electronic device, and readable storage medium
CN113973199A (zh) Light-transmissive display system and image output method and processing apparatus thereof
Zhou et al. Analysis and practical minimization of registration error in a spherical fish tank virtual reality system
WO2019100547A1 (zh) Projection control method and apparatus, projection interaction system, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18927505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18927505

Country of ref document: EP

Kind code of ref document: A1