WO2022228451A1 - Display processing method, apparatus and device based on human eye tracking, and storage medium - Google Patents

Display processing method, apparatus and device based on human eye tracking, and storage medium

Info

Publication number
WO2022228451A1
Authority
WO
WIPO (PCT)
Prior art keywords
stereoscopic effect
view
display screen
viewpoint
eyes
Prior art date
Application number
PCT/CN2022/089464
Other languages
English (en)
Chinese (zh)
Inventor
夏正国
Original Assignee
纵深视觉科技(南京)有限责任公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 纵深视觉科技(南京)有限责任公司
Priority to CN202280003327.7A (published as CN115398889A)
Publication of WO2022228451A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof

Definitions

  • the embodiments of the present application relate to the field of naked-eye 3D (three-dimensional) technologies, for example, to a display processing method, apparatus, device, and storage medium based on human eye tracking.
  • when the human eyes receive the two images in the correct order, a stereoscopic effect is produced.
  • if the human eyes receive the two images in the wrong order at a certain position, for example when the content seen by the left and right eyes corresponds to the right and left viewpoints respectively, eye dizziness results and the stereoscopic vision experience is incorrect.
  • Embodiments of the present application provide a display processing method, apparatus, device, and storage medium based on human eye tracking.
  • an embodiment of the present application provides a display processing method based on human eye tracking, including:
  • obtaining the human eye spatial position of a target viewer in front of a multi-viewpoint display screen; and, according to the human eye spatial position, controlling the eyes of the same target viewer to be located in the same multi-viewpoint stereoscopic effect visual area formed by the light-splitting action of the light-splitting device on the multi-viewpoint display screen;
  • the multiple viewpoints sequentially distributed in each multi-viewpoint stereoscopic effect visual area correspond one-to-one with multiple sequentially arranged views in the multi-viewpoint display screen.
  • an embodiment of the present application further provides a display processing device based on human eye tracking, including:
  • the position determination module is configured to obtain the human eye spatial position of the target viewer in front of the multi-viewpoint display screen;
  • the viewing area adjustment module is configured to control, according to the human eye spatial position, the eyes of the same target viewer to be located in the same multi-viewpoint stereoscopic effect visible area formed by the light-splitting effect of the light-splitting device on the multi-viewpoint display screen;
  • the multiple viewpoints sequentially distributed in each multi-viewpoint stereoscopic effect visual area correspond one-to-one with multiple sequentially arranged views in the multi-viewpoint display screen.
  • the embodiments of the present application also provide an electronic device, including:
  • one or more processors;
  • a storage device configured to store one or more programs;
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the display processing method based on human eye tracking as provided in any embodiment of the present application.
  • the embodiments of the present application further provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, it implements the display processing method based on human eye tracking provided in any embodiment of the present application.
  • FIG. 1 is a flowchart of a display processing method based on human eye tracking provided in an embodiment of the present application
  • FIG. 2 is a top view of a scene of a multi-viewpoint display provided in an embodiment of the present application
  • FIG. 3 is a flowchart of another display processing method based on human eye tracking provided in an embodiment of the present application.
  • FIG. 4 is a top view of a multi-viewpoint display performing human eye tracking on two viewers, provided in an embodiment of the present application;
  • FIG. 5 is a top view of a multi-viewpoint display performing human eye tracking on two viewers at one relative distance, provided in an embodiment of the present application;
  • FIG. 6 is a top view of a multi-viewpoint display performing human eye tracking on two viewers at another relative distance, provided in an embodiment of the present application;
  • FIG. 7 is a top view of a multi-viewpoint display performing human eye tracking on two viewers at yet another relative distance, provided in an embodiment of the present application;
  • FIG. 8 is a structural block diagram of a display processing device based on human eye tracking provided in an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 1 is a flowchart of a display processing method based on human eye tracking provided in an embodiment of the present application.
  • the embodiment of the present application can perform eye tracking adjustment when performing multi-view naked-eye 3D display.
  • the method can be executed by a display processing device based on human eye tracking, and the device can be implemented by means of software and/or hardware, and can be integrated on any electronic device with a network communication function.
  • the display processing method based on human eye tracking provided in the embodiment of the present application may include the following steps:
  • the light-splitting device refracts the content of different views to different places in space, so that the content of the different views arriving in front of the human eyes is separated; each view corresponds to one viewpoint in space, thereby realizing multi-viewpoint display on the display screen.
  • the light splitting device may be a view splitter or an image splitting device, such as a "slit" type grating (parallax barrier). In this way, at a suitable distance in front of the display screen, one eye can see one of the multiple views in turn, so that the human eye receives two images with parallax and produces a stereoscopic effect.
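  • The application does not spell out how screen pixels are mapped to the N views behind such a grating; purely as an illustration of the idea, the Python sketch below assigns each pixel column of an N-view panel to a view index by a simple cyclic rule. The function names, the purely column-based assignment and the `offset` parameter are assumptions for illustration, not the layout algorithm of this application.

```python
def view_index_for_column(column: int, num_views: int = 8, offset: int = 0) -> int:
    """Return which of the N views a given pixel column shows.

    Assumes an idealized slit-type grating in which consecutive pixel
    columns cycle through the views; `offset` shifts the whole cycle,
    which is the degree of freedom a layout algorithm can use to move
    the viewing zones (see the tracking sketch later in this text).
    """
    return (column + offset) % num_views


def interleave_views(views, offset: int = 0):
    """Build one interleaved frame from N per-view images.

    `views` is a list of N images given as nested lists of pixels with
    identical dimensions; column c of the output is taken from the view
    selected by view_index_for_column(c, ...).
    """
    num_views = len(views)
    height = len(views[0])
    width = len(views[0][0])
    frame = [[None] * width for _ in range(height)]
    for c in range(width):
        v = view_index_for_column(c, num_views, offset)
        for r in range(height):
            frame[r][c] = views[v][r][c]
    return frame
```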
  • one or more cameras are mounted on the multi-viewpoint display screen to capture images of the viewer in front of it; the captured images collected by the cameras are analyzed to establish position coordinates in front of the multi-viewpoint display screen, and the system calculates the human eye spatial position of the target viewer in front of the multi-viewpoint display screen.
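  • How the captured images are converted into a spatial eye position is not detailed here; the following is a minimal sketch under a single-camera pinhole model, in which depth is inferred from the apparent pixel distance between the two eyes and an assumed average interpupillary distance. All constants and helper names are hypothetical.

```python
from dataclasses import dataclass

IPD_MM = 63.0          # assumed average interpupillary distance
FOCAL_PX = 1200.0      # assumed camera focal length in pixels
CX, CY = 960.0, 540.0  # assumed principal point (1920x1080 camera)

@dataclass
class EyePosition:
    x_mm: float  # horizontal offset from the camera axis
    y_mm: float  # vertical offset from the camera axis
    z_mm: float  # distance from the display/camera plane

def eyes_from_pixels(left_px, right_px) -> tuple[EyePosition, EyePosition]:
    """Estimate both eyes' 3D positions from their pixel coordinates.

    Depth is inferred from the apparent pixel distance between the eyes
    under the assumed interpupillary distance (single-camera case);
    a stereo camera pair or depth sensor would replace this step.
    """
    (ul, vl), (ur, vr) = left_px, right_px
    pixel_dist = ((ur - ul) ** 2 + (vr - vl) ** 2) ** 0.5
    z = FOCAL_PX * IPD_MM / pixel_dist  # pinhole model
    def back_project(u, v):
        return EyePosition((u - CX) * z / FOCAL_PX, (v - CY) * z / FOCAL_PX, z)
    return back_project(ul, vl), back_project(ur, vr)
```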
  • the multiple viewpoints sequentially distributed in each multi-viewpoint stereoscopic effect visual area correspond one-to-one with multiple sequentially arranged views in the multi-viewpoint display screen.
  • N views are arranged in the display screen (for example, N may be 8); at a certain distance, a human eye moving from left to right sees the first view through the Nth view in sequence, and then sees the same sequence repeated.
  • That is, multiple repeated regions covering the first view through the Nth view are formed in front of the display screen, and each such region is denoted as a multi-viewpoint stereoscopic effect visual area.
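  • As a rough illustration of how these repeated regions can be indexed, the sketch below maps a horizontal eye position to its viewing-zone number and to the viewpoint seen inside that zone, assuming an idealized layout in which every zone has the same cone width at the optimal viewing distance; the coordinate origin and function name are assumptions.

```python
def locate_eye(x_mm: float, cone_mm: float, num_views: int = 8):
    """Map a horizontal eye position to (zone index, viewpoint index).

    Zones of width `cone_mm` repeat along the screen; inside each zone
    the viewpoints 0..num_views-1 are laid out in order, matching the
    one-to-one correspondence between viewpoints and views.
    x_mm is measured from the left edge of zone 0 at the viewing distance.
    """
    zone = int(x_mm // cone_mm)
    position_in_zone = x_mm - zone * cone_mm
    viewpoint = int(position_in_zone / (cone_mm / num_views))
    return zone, min(viewpoint, num_views - 1)

# Example: with an 8-view screen and a 300 mm cone, an eye at x = 410 mm
# sits in zone 1 and sees the view associated with viewpoint 2.
print(locate_eye(410.0, 300.0))  # -> (1, 2)
```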
  • each multi-viewpoint stereoscopic effect viewing area may include a spatial area, parallel to the multi-viewpoint display screen, in which, during human eye tracking, the multiple views arranged in sequence in the multi-viewpoint display screen present a naked-eye stereoscopic viewing effect satisfying a preset condition.
  • alternatively, each multi-viewpoint stereoscopic effect viewing area may include a limited plane, parallel to the display screen, in which the multiple views arranged in sequence in the multi-viewpoint display screen present a naked-eye stereoscopic viewing effect satisfying a preset condition, together with the space formed by such limited planes within a preset distance range in front of and behind that plane.
  • each viewpoint corresponds to one of the multiple views arranged in sequence in the multi-viewpoint display screen, so that in any of the above-mentioned multi-viewpoint stereoscopic effect viewing areas, the viewer's single eye sees, from each viewpoint in turn, one of the multiple views arranged in the display screen.
  • the width of each multi-viewpoint stereoscopic effect viewing area in the direction along which the viewpoints are sequentially distributed is greater than the distance between the eyes of the target viewer; see, for example, cone1 shown in FIG. 2.
  • the cone width of each multi-viewpoint stereoscopic effect viewing area is usually larger than the head width, and the viewer's binocular span is usually larger than the width of a single view; therefore, as long as the viewer's head is within any one multi-viewpoint stereoscopic effect viewing area, the eyes see two view images with parallax in the correct order, producing a good stereoscopic display effect.
  • the sequentially distributed multiple viewpoints in each multi-viewpoint stereoscopic effect visual area correspond one-to-one with the sequentially arranged multiple views in the multi-viewpoint display screen; that is, within each multi-viewpoint stereoscopic effect visual area, the order of the viewpoints is the same as the order of the views to which those viewpoints correspond.
  • the spatial distribution position of the multi-viewpoint stereoscopic effect visual areas of the multi-viewpoint display screen formed by the light-splitting effect of the light-splitting device is dynamically adjusted, so that the viewer's eyes avoid the junction position between two multi-viewpoint stereoscopic effect visual areas; the viewer's two eyes are thereby kept within the same multi-viewpoint stereoscopic effect visual area at any position, giving the correct stereoscopic effect when viewing.
  • the position of each multi-viewpoint stereoscopic effect viewable area formed by the light-splitting action of the light-splitting device on the display screen changes with the position of the viewer's eyes, so that the user's two eyes remain in the same visual area at any position; this prevents the eyes from spanning the junction of two adjacent multi-viewpoint stereoscopic effect visual areas and seeing the two view images in the wrong order, which would impair the stereoscopic visual experience, and the 3D stereoscopic effect can therefore be viewed at any position in front of the display.
  • the embodiment of the present application provides a display processing method based on human eye tracking: while the multi-viewpoint display screen displays images, the human eye spatial position of the target viewer in front of the multi-viewpoint display screen is obtained, and, based on that human eye spatial position, the eyes of the target viewer are kept within the same multi-viewpoint stereoscopic effect visual area formed by the light-splitting effect of the light-splitting device on the multi-viewpoint display screen.
  • FIG. 3 is a flowchart of another display processing method based on human eye tracking provided in an embodiment of the present application.
  • the embodiments of the present application are optimized on the basis of the foregoing embodiments, and the embodiments of the present application may be combined with each optional solution in one or more of the foregoing embodiments.
  • the display processing method based on human eye tracking provided in the embodiment of the present application may include the following steps:
  • determining the human eye space position of the target viewer in front of the multi-viewpoint display screen may include the following operations:
  • eye tracking technology is employed to track the spatial position of each target viewer's eye.
  • the solution of the present application has strong adaptability to scenes with one or two viewers, but has low adaptability to scenes with three or more viewers.
  • the horizontally distributed views 1 to n circulate in spaces with relatively clean light, for example the spaces where the line segments AB, BC, ... in FIG. 2 are located; when the eyes are located in such a space, or within a certain narrow range in front of or behind it, the eyes are in the same multi-viewpoint stereoscopic effect visual area and the correct 3D effect can always be experienced.
  • the multi-viewpoint stereoscopic effect visual areas of the multi-viewpoint display screen formed by the light-splitting effect of the light-splitting device are not fixed; adjusting the arrangement of the pixel content behind the light-splitting device in the multi-viewpoint display screen can change the spatial distribution position of the visual areas within a relatively wide range, so that the multi-viewpoint stereoscopic effect visual areas can be moved back and forth and left and right, the viewer's eyes avoid crossing the junction of two visual areas, and the naked-eye 3D effect can be watched at any position.
  • the multiple viewpoints sequentially distributed in each multi-viewpoint stereoscopic effect visual area correspond one-to-one with multiple sequentially arranged views in the multi-viewpoint display screen.
  • controlling the eyes of the same target viewer to be located in the same multi-view stereoscopic effect visual area may include the following operations:
  • the center position of the same multi-viewpoint stereoscopic effect viewable area is moved so as to be aligned with the center position between the eyes of the same target viewer.
  • the eye-tracking system obtains the spatial position of the target viewer's eyes in real time;
  • the arrangement of the pixel content behind the light-splitting device in the multi-viewpoint display screen is adjusted by the layout algorithm, and the spatially distributed multi-viewpoint stereoscopic effect visual areas formed by the light-splitting effect of the light-splitting device are moved in real time, so that the middle position of the same multi-viewpoint stereoscopic effect visual area stays aligned with the position between the eyes of the target viewer.
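  • The layout algorithm itself is not disclosed in this text; the sketch below only illustrates the alignment rule just described, computing how far the zone pattern must shift so that the center of one zone lands on the midpoint between the tracked eyes, and converting that shift into a whole-view offset that a hypothetical interleaving routine could apply. The linear relation between view offset and zone movement is an assumption.

```python
def zone_shift_for_tracking(eye_mid_x_mm: float, cone_mm: float) -> float:
    """Return the lateral shift (mm) that centers a viewing zone on the eyes.

    The zone pattern repeats every `cone_mm`, so only the residual shift
    within one period matters; the result lies in (-cone/2, +cone/2].
    """
    # Offset of the eye midpoint from the nearest zone center.
    residual = (eye_mid_x_mm - cone_mm / 2.0) % cone_mm
    if residual > cone_mm / 2.0:
        residual -= cone_mm
    return residual

def phase_offset(shift_mm: float, cone_mm: float, num_views: int = 8) -> int:
    """Convert the lateral shift into a whole-view offset for interleaving.

    Assumes, purely for illustration, that shifting the pixel arrangement
    by one view moves every zone by cone/num_views in space.
    """
    return round(shift_mm / (cone_mm / num_views)) % num_views

# Example: eyes centered at x = 430 mm with a 300 mm cone -> shift by -20 mm,
# i.e. roughly one view step of the interleaving pattern.
shift = zone_shift_for_tracking(430.0, 300.0)
print(shift, phase_offset(shift, 300.0))
```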
  • controlling the eyes of the same target viewer to be located in the same multi-view stereoscopic effect visual area may include the following operation steps A1-A2:
  • Step A1: in response to the presence of two target viewers in front of the multi-viewpoint display screen, determine the relative position of the two target viewers in front of the multi-viewpoint display screen.
  • Step A2: according to the relative position between the two target viewers, and in the preset adjustment mode corresponding to that relative position, control the eyes of each target viewer to be located in a same multi-viewpoint stereoscopic effect visual area by adjusting the spatial distribution positions of at least two multi-viewpoint stereoscopic effect visual areas.
  • as shown in FIG. 4, similar to the single-viewer case, when two viewers are eye-tracked, each viewer's eyes are placed inside a multi-viewpoint stereoscopic effect visual area so as to avoid the junction position between two visual areas, thereby achieving human eye tracking.
  • the human eye spatial positions of the two target viewers in front of the multi-viewpoint display screen are tracked respectively by the human eye detection and tracking algorithm, and the relative distance between the two pairs of eyes is analyzed so that different movement adjustment modes can be applied, keeping every eye away from the junctions of the multi-viewpoint stereoscopic effect viewing areas.
  • in response to the relative distance between the two target viewers in front of the multi-viewpoint display screen being smaller than a first preset multiple of the width of the multi-viewpoint stereoscopic effect visual area, the eyes of the two target viewers are arranged in one multi-viewpoint stereoscopic effect visual area.
  • for example, the first preset multiple of the width of the multi-viewpoint stereoscopic effect viewing area is 1/2, i.e. the threshold is cone/2, where cone is the width of the viewing area along the direction in which the viewpoints are sequentially distributed.
  • the eyes of the two target viewers are respectively arranged in two adjacent multi-viewpoint stereoscopic effect viewing areas; for example, when cone/2 ≤ relative distance between the two target viewers < cone*3/2, the eyes of the two target viewers can be moved into two adjacent multi-viewpoint stereoscopic effect visual areas respectively.
  • the second preset multiple is 3/2.
  • the eyes of the two target viewers are respectively arranged in two multi-viewpoint stereoscopic effect visual areas separated by at least one multi-viewpoint stereoscopic effect visual area.
  • the number of multi-viewpoint stereoscopic effect viewing areas lying between the two viewing areas that accommodate the eyes of the two target viewers is determined, so that the eyes of the two target viewers are respectively arranged in two multi-viewpoint stereoscopic effect viewing areas separated by that number of intervening viewing areas.
  • the eyes of the two target viewers are respectively arranged in two multi-viewpoint stereoscopic effect viewing areas separated by one multi-viewpoint stereoscopic effect viewing area; for example, when cone*3/2 ≤ relative distance between the two target viewers < cone*5/2, the eyes of the two target viewers can be moved into two multi-viewpoint stereoscopic effect visual areas separated by one such visual area.
  • the third preset multiple is 5/2.
  • in this way, the tracked eyes of the two target viewers can always be placed in two multi-viewpoint stereoscopic effect viewing areas separated by the appropriate number of viewing areas (or in a single viewing area), and the eyes of each viewer avoid the boundary lines of the multi-viewpoint stereoscopic effect viewing areas; the selection logic is illustrated by the sketch below.
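  • To make the thresholds above concrete, here is a minimal sketch of that selection logic using the cone/2, cone*3/2 and cone*5/2 boundaries named in the text; the return encoding and the extension of the same pattern to larger distances are assumptions for illustration.

```python
def two_viewer_adjustment(relative_distance_mm: float, cone_mm: float) -> int:
    """Return how many viewing zones should separate the two viewers' eyes.

    0 -> place both viewers' eyes in one multi-viewpoint viewing zone
    1 -> place them in two adjacent viewing zones
    k -> place them in two zones separated by k-1 intervening zones
    Thresholds follow the cone/2, 3*cone/2, 5*cone/2 ... pattern described
    above, which generalizes to (2k+1)*cone/2.
    """
    if relative_distance_mm < cone_mm / 2.0:
        return 0
    # Smallest k with relative_distance < (2k + 1) * cone / 2.
    k = 1
    while relative_distance_mm >= (2 * k + 1) * cone_mm / 2.0:
        k += 1
    return k

# Example with a 300 mm cone:
for d in (100.0, 300.0, 600.0, 900.0):
    print(d, "->", two_viewer_adjustment(d, 300.0))
# 100 -> 0 (same zone), 300 -> 1 (adjacent zones),
# 600 -> 2 (one zone in between), 900 -> 3
```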
  • when the front-to-back distance between the two target viewers is large, if the multi-viewpoint stereoscopic effect viewing areas are moved, in the direction perpendicular to the multi-viewpoint display screen, toward the position of one of the two target viewers, the 3D experience of the other viewer becomes worse; this mode is therefore suited to the situation where the two viewers are relatively close front-to-back, in which case both can obtain a normal 3D effect, whereas when the front-to-back distance between the two target viewers is large, one of them will always have a poor 3D experience.
  • this mode is valid when the two viewers are separated or move in a direction parallel to the display screen; it cannot be applied when one of the viewers moves in the direction perpendicular to the screen (the near-far direction), or when the two viewers move by different distances in the direction perpendicular to the display screen.
  • the position of at least one multi-viewpoint stereoscopic effect viewable area formed by the light-splitting action of the light-splitting device on the display screen changes with the position of the viewer's eyes, so that the user's two eyes remain in the same visual area at any position; this prevents the eyes from spanning the junction of two adjacent multi-viewpoint stereoscopic effect visual areas and seeing the two view images in the wrong order, which would impair the stereoscopic visual experience, and the 3D stereoscopic effect can therefore be viewed at any position in front of the display.
  • FIG. 8 is a structural block diagram of a display processing apparatus based on human eye tracking provided in an embodiment of the present application.
  • the embodiment of the present application can perform eye tracking adjustment when performing multi-view naked-eye 3D display.
  • the display processing apparatus based on human eye tracking can be implemented in software and/or hardware, and can be integrated in any electronic device with network communication function.
  • the display processing apparatus based on human eye tracking provided in this embodiment of the present application may include the following: a position determination module 810 and a viewing area adjustment module 820, wherein:
  • the position determination module 810 is configured to obtain the human eye space position of the target viewer in front of the multi-viewpoint display screen.
  • the viewing area adjustment module 820 is configured to control the eyes of the same target viewer to be located in the same multi-view stereoscopic effect visual area formed by the light-splitting effect of the light-splitting device on the multi-view display screen according to the spatial position of the human eyes.
  • the multiple viewpoints sequentially distributed in each multi-viewpoint stereoscopic effect visual area correspond one-to-one with multiple sequentially arranged views in the multi-viewpoint display screen.
  • the width of each multi-viewpoint stereoscopic effect viewable area is larger than the binocular distance of the target viewer in the direction along which the multiple viewpoints are sequentially distributed.
  • the multi-viewpoint stereoscopic effect visual area includes a spatial area, parallel to the multi-viewpoint display screen, in which, during human eye tracking, the multiple views arranged in sequence in the multi-viewpoint display screen present a naked-eye stereoscopic viewing effect satisfying a preset condition.
  • the location determination module 810 is set to:
  • eye tracking technology is employed to track the spatial position of each target viewer's eye.
  • the viewing area adjustment module 820 is set to:
  • the spatial distribution positions of at least two multi-viewpoint stereoscopic effect visual areas formed by the light-splitting effect on the multi-viewpoint display screen are adjusted, and the eyes of the same target viewer are controlled to be located in the same multi-viewpoint stereoscopic effect visual area.
  • the viewing area adjustment module 820 is set to:
  • the center position of the same multi-viewpoint stereoscopic effect viewable area is moved so as to be aligned with the center position between the eyes of the same target viewer.
  • the viewing area adjustment module 820 is set to:
  • the spatial distribution positions of at least two multi-viewpoint stereoscopic effect viewing areas are controlled to achieve one of the following adjustment modes:
  • the eyes of the two target viewers are arranged in one multi-viewpoint stereoscopic effect viewing area;
  • the eyes of the two target viewers are respectively arranged in two adjacent multi-viewpoint stereoscopic effect viewing areas; or
  • the eyes of the two target viewers are respectively arranged in two multi-viewpoint stereoscopic effect viewing areas separated by at least one multi-viewpoint stereoscopic effect viewing area.
  • the display processing device based on human eye tracking provided in the embodiment of the present application can execute the display processing method based on human eye tracking provided in any embodiment of the present application, and has the corresponding functional modules and effects for executing that method.
  • FIG. 9 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • the electronic device provided in this embodiment of the present application includes one or more processors 910 and a storage device 920; the number of processors 910 in the electronic device may be one or more, and one processor 910 is taken as an example in FIG. 9;
  • the storage device 920 is used to store one or more programs; when the one or more programs are executed by the one or more processors 910, the one or more processors 910 implement the display processing method based on human eye tracking described in any embodiment of the present application.
  • the electronic device may further include: an input device 930 and an output device 940 .
  • the processor 910 , the storage device 920 , the input device 930 and the output device 940 in the electronic device may be connected by a bus or in other ways, and the connection by a bus is taken as an example in FIG. 9 .
  • the storage device 920 in the electronic device can be used to store one or more programs, and the programs can be software programs, computer-executable programs, and modules, as provided in the embodiments of the present application.
  • the processor 910 executes various functional applications and data processing of the electronic device by running the software programs, instructions and modules stored in the storage device 920, that is, to implement the display processing method based on human eye tracking in the above method embodiments.
  • the storage device 920 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the electronic device, and the like. Additionally, storage device 920 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, storage device 920 may further include memory located remotely from processor 910, which may be connected to the device through a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the input device 930 may be used to receive input numerical or character information, and generate key signal input related to user setting and function control of the electronic device.
  • the output device 940 may include a display device such as a display screen, for example, a multi-view display screen.
  • when the one or more programs included in the above electronic device are executed by the one or more processors 910, the programs perform the following operations: obtaining the human eye spatial position of a target viewer in front of the multi-viewpoint display screen; and, according to the human eye spatial position, controlling the eyes of the same target viewer to be located in the same multi-viewpoint stereoscopic effect visual area formed by the light-splitting action of the light-splitting device on the multi-viewpoint display screen;
  • the multiple viewpoints sequentially distributed in each multi-viewpoint stereoscopic effect visual area correspond one-to-one with multiple sequentially arranged views in the multi-viewpoint display screen.
  • An embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, it is used to execute a display processing method based on human eye tracking, the method including:
  • obtaining the human eye spatial position of a target viewer in front of the multi-viewpoint display screen; and, according to the human eye spatial position, controlling the eyes of the same target viewer to be located in the same multi-viewpoint stereoscopic effect visual area formed by the light-splitting action of the light-splitting device on the multi-viewpoint display screen;
  • the multiple viewpoints sequentially distributed in each multi-viewpoint stereoscopic effect visual area correspond one-to-one with multiple sequentially arranged views in the multi-viewpoint display screen.
  • when the program is executed by the processor, it may also be used to execute the display processing method based on human eye tracking provided in any embodiment of the present application.
  • the computer storage medium of the embodiments of the present application may adopt any combination of one or more computer-readable media.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above.
  • Examples of computer-readable storage media include: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM) or flash memory, an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium can be any tangible medium that contains or stores a program that can be used by or in connection with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a propagated data signal in baseband or as part of a carrier wave, with computer-readable program code embodied thereon. Such propagated data signals may take a variety of forms including, but not limited to, electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium that can transmit, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device .
  • Program code embodied on a computer-readable medium may be transmitted using any suitable medium, including but not limited to: wireless, wire, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the foregoing.
  • Computer program code for performing the operations of the present application may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or it may be connected to an external computer (e.g., via the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

Embodiments of the present application relate to a display processing method, apparatus and device based on human eye tracking, and a storage medium. The method comprises the following steps: acquiring a human eye spatial position of a target viewer in front of a multi-viewpoint display screen; and, according to the human eye spatial position, controlling the two eyes of the same target viewer to be located in a same multi-viewpoint stereoscopic effect visual area formed by a light-splitting effect of a light-splitting device on the multi-viewpoint display screen. A plurality of viewpoints sequentially distributed in each multi-viewpoint stereoscopic effect visual area are in one-to-one correspondence with a plurality of views sequentially arranged in the multi-viewpoint display screen.
PCT/CN2022/089464 2021-04-30 2022-04-27 Display processing method, apparatus and device based on human eye tracking, and storage medium WO2022228451A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202280003327.7A CN115398889A (zh) 2021-04-30 2022-04-27 基于人眼跟踪的显示处理方法、装置、设备及存储介质

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110483471.2 2021-04-30
CN202110483471 2021-04-30

Publications (1)

Publication Number Publication Date
WO2022228451A1 (fr)

Family

ID=83847764

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/089464 WO2022228451A1 (fr) Display processing method, apparatus and device based on human eye tracking, and storage medium

Country Status (2)

Country Link
CN (1) CN115398889A (fr)
WO (1) WO2022228451A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6593957B1 (en) * 1998-09-02 2003-07-15 Massachusetts Institute Of Technology Multiple-viewer auto-stereoscopic display systems
  • CN102497563A (zh) * 2011-12-02 2012-06-13 深圳超多维光电子有限公司 Tracking-type naked-eye stereoscopic display control method, display control device and display system
  • CN102572484A (zh) * 2012-01-20 2012-07-11 深圳超多维光电子有限公司 Stereoscopic display control method, stereoscopic display control device and stereoscopic display system
  • CN104137538A (zh) * 2011-12-23 2014-11-05 韩国科学技术研究院 Apparatus for displaying a multi-viewpoint 3D image using dynamic viewing zone expansion applicable to multiple observers, and method therefor
  • CN107885325A (zh) * 2017-10-23 2018-04-06 上海玮舟微电子科技有限公司 Naked-eye 3D display method and control system based on human eye tracking

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • CN103018915B (zh) * 2012-12-10 2016-02-03 Tcl集团股份有限公司 3D integral imaging display method based on human eye tracking and integral imaging 3D display
  • CN108174182A (zh) * 2017-12-30 2018-06-15 上海易维视科技股份有限公司 Viewing zone adjustment method and display system for three-dimensional tracking-type naked-eye stereoscopic display


Also Published As

Publication number Publication date
CN115398889A (zh) 2022-11-25

Similar Documents

Publication Publication Date Title
US20220292790A1 (en) Head-mounted display with pass-through imaging
  • CN112584080B (zh) Three-dimensional telepresence terminal and method
  • KR102415502B1 (ko) Method and apparatus for light field rendering for a plurality of users
US10715791B2 (en) Virtual eyeglass set for viewing actual scene that corrects for different location of lenses than eyes
  • KR102121389B1 (ko) Glasses-free 3D display apparatus and control method thereof
  • KR20140038366A (ko) Three-dimensional display with motion parallax
  • KR102174258B1 (ko) Glasses-free 3D display apparatus and control method thereof
GB2498184A (en) Interactive autostereoscopic three-dimensional display
US8368690B1 (en) Calibrator for autostereoscopic image display
US10990062B2 (en) Display system
US20180184066A1 (en) Light field retargeting for multi-panel display
  • KR20160025522A (ko) Multi-view three-dimensional display system and method with position sensing and an adaptive number of views
  • KR101975246B1 (ko) Multi-view image display apparatus and control method thereof
US9167237B2 (en) Method and apparatus for providing 3-dimensional image
  • JPH09238369A (ja) Three-dimensional image display device
  • CN113079364A (zh) Stereoscopic display method and apparatus for a static object, medium, and electronic device
US20140071237A1 (en) Image processing device and method thereof, and program
  • CN115668913A (zh) Stereoscopic display method, apparatus, medium and system for a live performance
US10558056B2 (en) Stereoscopic image display device and stereoscopic image display method
  • JP2006101224A (ja) Image generation device, image generation method, and image generation program
  • JP2006115151A (ja) Stereoscopic display device
Park et al. 48.2: Light field rendering of multi‐view contents for high density light field 3D display
  • WO2022228451A1 (fr) Display processing method, apparatus and device based on human eye tracking, and storage medium
  • JP2012222549A (ja) Video display device and video display method
  • JP4955718B2 (ja) Stereoscopic display control device, stereoscopic display system, and stereoscopic display control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22794921

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22794921

Country of ref document: EP

Kind code of ref document: A1