CN114567767A - Display device, light field acquisition method, image data transmission method and related equipment - Google Patents

Display device, light field acquisition method, image data transmission method and related equipment

Info

Publication number
CN114567767A
CN114567767A (application CN202210167540.3A)
Authority
CN
China
Prior art keywords
light
camera
image
image information
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210167540.3A
Other languages
Chinese (zh)
Other versions
CN114567767B (en)
Inventor
王雷
海晓泉
冯煊
李亚鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202210167540.3A priority Critical patent/CN114567767B/en
Publication of CN114567767A publication Critical patent/CN114567767A/en
Application granted granted Critical
Publication of CN114567767B publication Critical patent/CN114567767B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F9/00 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G09F9/30 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194 Transmission of image signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The disclosure provides a display device, a light field acquisition method, an image data transmission method and related equipment. The display device is divided into a display area and a light field acquisition area and includes: a display panel, comprising a display structure arranged in the display area and a light selecting structure arranged in the light field acquisition area, configured to determine a light-transmitting region of the light selecting structure according to the position of a viewing user and to control the light-transmitting region to be in a light-transmitting state; a backlight module arranged on the backlight side of the display structure; and a camera array arranged on the backlight side of the light selecting structure and configured to start at least one camera corresponding to the light-transmitting region for 3D image acquisition, so as to obtain 3D image information. The part performing light field acquisition can share the same display panel with the display part, which is convenient to use; the corresponding light-transmitting region is selected according to the position of the viewing user, and the camera corresponding to that region is started to acquire the 3D image from the viewing user's angle of view, so no algorithm synthesis is needed and the delay time is shortened.

Description

Display device, light field acquisition method, image data transmission method and related equipment
Technical Field
The present disclosure relates to the field of display technologies, and in particular, to a display device, a light field acquisition method, an image data transmission method, and a related apparatus.
Background
Because 3D images have an outstanding sense of depth, watching them gives a stronger immersive effect, and imaging products based on 3D images are increasingly popular.
Existing 3D image acquisition equipment needs to capture multiple images with multiple cameras and synthesize them with an algorithm, but the synthesis takes a long time, so the real-time performance of image acquisition is poor.
Disclosure of Invention
In view of the above, the present disclosure is directed to a display device, a light field collecting method, an image data transmission method and a related apparatus.
In view of the above, a first aspect of the present disclosure provides a display device, which is divided into a display area and a light field acquisition area, the display device comprising:
a display panel, comprising a display structure arranged in the display area and a light selecting structure arranged in the light field acquisition area, and configured to determine a light-transmitting region of the light selecting structure according to the position of a viewing user and to control the light-transmitting region to be in a light-transmitting state;
a backlight module arranged on the backlight side of the display structure;
and a camera array arranged on the backlight side of the light selecting structure and configured to start at least one camera corresponding to the light-transmitting region for 3D image acquisition, so as to obtain 3D image information.
In a second aspect of the present disclosure, there is provided a light field collecting method applied to the display device of the first aspect, including:
determining, according to the position of a viewing user, a light-transmitting region of the light selecting structure in the display panel corresponding to that position, and controlling the light-transmitting region to be in a light-transmitting state;
and controlling at least one camera in the camera array corresponding to the light-transmitting region to start 3D image acquisition, so as to obtain 3D image information.
In a third aspect of the present disclosure, there is provided an image data transmission method, including:
receiving a 3D image acquisition instruction sent by a viewing user side;
receiving the position of the viewing user sent by the position tracking device;
acquiring 3D image information based on the 3D image acquisition instruction and the position of the viewing user using the light field acquisition method of the second aspect;
and sending the 3D image information to the viewing user side for 3D display.
In a fourth aspect of the present disclosure, there is provided an image data transmission apparatus including:
the receiving module is configured to receive a 3D image acquisition instruction sent by a viewing user side and receive the position of the viewing user sent by the position tracking device;
a 3D imaging module configured to acquire 3D image information based on the 3D image acquisition instruction and the position of the viewing user using the light field acquisition method of the second aspect;
a sending module configured to send the 3D image information to the viewing user terminal for 3D display.
In a fifth aspect of the disclosure, a computer device is provided, comprising one or more processors, memory; and one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, the programs comprising instructions for performing the method of the second or third aspect.
A sixth aspect of the disclosure provides a non-transitory computer readable storage medium containing a computer program which, when executed by one or more processors, causes the processors to perform the method of the second or third aspect.
A seventh aspect of the present disclosure provides a computer program product comprising computer program instructions which, when run on a computer, cause the computer to perform the method of the second or third aspect.
As can be seen from the above, according to the display device, the light field acquisition method, the image data transmission method and the related apparatus provided by the present disclosure, the part performing light field acquisition and the display part can share the same display panel, so the device is more convenient to use and the overall structure is more attractive. When light field acquisition is performed, the corresponding light-transmitting region can be determined according to the position of the viewing user and placed in a light-transmitting state, and the camera corresponding to that region is started, so that image light can pass through the light-transmitting region onto the camera; the camera then performs 3D image acquisition, and the acquired 3D image is sent to the viewing user side for display. Because the light-transmitting region is adjusted according to the position of the viewing user, the camera arranged on the display device can acquire the corresponding 3D image from the viewing user's angle of view without algorithm synthesis, which guarantees the 3D image quality, reduces the delay time, and ensures the real-time performance of 3D image transmission.
Drawings
In order to clearly illustrate the technical solutions of the present disclosure or the related art, the drawings used in the description of the embodiments or the related art are briefly introduced below. Obviously, the drawings described below show only embodiments of the present disclosure, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1A is a schematic diagram showing an application scenario in which two users conduct a video call in the related art;
fig. 1B is a schematic diagram illustrating image acquisition by controlling different cameras in the related art;
fig. 1C is a diagram illustrating viewpoint interpolation synthesis in the related art;
fig. 2A shows a schematic structural diagram of a display device of an embodiment of the present disclosure;
FIG. 2B shows a schematic distribution diagram of a display area and a light field acquisition area of a display device of an embodiment of the present disclosure;
FIG. 2C shows a schematic view of a selection of light transmissive regions of an embodiment of the present disclosure;
fig. 2D shows a schematic arrangement of a camera array of an embodiment of the disclosure;
FIG. 2E shows a schematic structural diagram of a light transmission component of an embodiment of the present disclosure;
fig. 3A illustrates a flow chart of a light field acquisition method of a display device of an embodiment of the present disclosure;
FIG. 3B shows a schematic diagram of pixels acquired by two cameras at the 0 field of view;
FIG. 3C shows a schematic diagram of pixels acquired by two cameras at the 1 field of view;
FIG. 3D shows a schematic of two cameras and corresponding apertures for the light transmissive regions;
FIG. 3E shows a schematic diagram of summing pixels captured by two cameras;
FIG. 3F shows a schematic diagram of one camera capturing pixels;
FIG. 4 shows a flow chart of an image data transmission method of an embodiment of the present disclosure;
fig. 5 shows a block diagram of the configuration of an image data transmission apparatus of the embodiment of the present disclosure;
fig. 6 shows a hardware configuration diagram of a computer device according to an embodiment of the present disclosure.
Description of reference numerals: 200. a display device;
210. a display panel, 210a. a display structure, 210b. a light selecting structure, 211. a first polarizer, 212. a first substrate, 2121. a color film layer, 213. a liquid crystal layer, 214. a second substrate, 2141. a light transmitting component, 21411. a transparent pixel electrode, 21412. a driving unit, 215. a second polarizer;
220. a backlight module;
230. an array of cameras;
500. an image data transmission device;
510. a receiving module, 520. a 3D imaging module, 530. a transmitting module.
Detailed Description
For the purpose of promoting a better understanding of the objects, aspects and advantages of the present disclosure, reference is made to the following detailed description taken in conjunction with the accompanying drawings.
It is to be noted that technical terms or scientific terms used in the embodiments of the present disclosure should have a general meaning as understood by those having ordinary skill in the art to which the present disclosure belongs, unless otherwise defined. The use of "first," "second," and similar terms in the embodiments of the disclosure is not intended to indicate any order, quantity, or importance, but rather to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect.
The acquisition of the viewing user's position described in the embodiments below is performed only after the viewing user has confirmed that acquisition is permitted. In a specific implementation, a pop-up window asking "whether to allow the position to be acquired" is displayed on the terminal device used by the viewing user, and the position is acquired only after the user clicks to confirm; alternatively, the viewing user presets through the terminal device that position acquisition is allowed, in which case no dialog box is popped up when the position is acquired.
In addition, the 3D image acquisition described in the following embodiments is also performed after the user of the display device authorizes the acquisition and transmission of the 3D image.
In the related art, as shown in fig. 1A, the system is mainly composed of two 3D display ports (display screen A and display screen B in fig. 1A), each equipped with a light field camera array (light field camera array A and light field camera array B in fig. 1A). User A can see the 3D image of user B in front of display screen A, and user B can see the 3D image of user A in front of display screen B; the 3D image signals of the two users are acquired in real time by the light field camera array equipped at the other display port.
The camera used for 3D image acquisition is an RGBD (Red Green Blue Depth) camera, which can acquire the depth information of an image. An image with depth information is collected by the RGBD camera, 3D modeling is performed on the captured image, the modeled 3D image model is transmitted to the other port, and the image is arranged and displayed.
In this manner, when user B wants to view the 3D image of user A, as shown in fig. 1B, the position of user B relative to user A is random and continuous, while the number of cameras in the array is limited and their positions are discrete and fixed. It is therefore necessary to determine, according to the position of user B, the two cameras in device A closest to that position, capture images with them, and then perform viewpoint synthesis. As shown in fig. 1C, the position of user B corresponds to the dotted-line camera of device A, but there is no physical camera at that position, so device B runs a viewpoint interpolation algorithm to synthesize the images captured by camera n and camera n+1 before displaying them.
The problems with this approach are: because the viewpoint image is synthesized by a viewpoint interpolation algorithm, its quality is not as good as an image shot directly at that position, and the interpolation algorithm takes a long time, so the real-time performance is poor.
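As a rough illustration of the related-art interpolation step (a sketch only: the function name, the linear blend, and the nested-list image format are our assumptions, not taken from the patent), a virtual view between camera n and camera n+1 can be approximated by weighting the two captured images by the viewpoint's fractional position:

```python
def interpolate_viewpoint(img_n, img_n1, t):
    """Blend two same-sized grayscale images; t in [0, 1] is the virtual
    viewpoint's position between camera n (t = 0) and camera n+1 (t = 1)."""
    assert 0.0 <= t <= 1.0
    return [
        [(1.0 - t) * a + t * b for a, b in zip(row_n, row_n1)]
        for row_n, row_n1 in zip(img_n, img_n1)
    ]

# A viewer midway between the two cameras gets an equal blend of both images.
left = [[0.0, 100.0]]
right = [[100.0, 200.0]]
mid = interpolate_viewpoint(left, right, 0.5)
```

Even this trivial blend must run per pixel per frame; the real algorithms also estimate disparity, which is where the delay criticized above comes from.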
In view of the above problem, an embodiment of the present disclosure provides a display device 200, as shown in fig. 2A, which is configured with a display area and a light field acquisition area, where the display device 200 includes:
a display panel 210, including a display structure 210a disposed in the display region and a light selecting structure 210b disposed in the light field collection region, configured to determine a light-transmitting region (a pixel opening region as shown in fig. 2C) of the light selecting structure 210b according to the position of a viewing user, and to control the light-transmitting region to be in a light-transmitting state; a backlight module 220 disposed on the backlight side of the display structure 210a; and a camera array 230 disposed on the backlight side of the light selecting structure 210b, configured to activate at least one camera corresponding to the light-transmitting region for 3D image acquisition.
In a specific implementation, when 3D image acquisition is not being performed, the light field collection region is in an opaque dark state, and the user can view the picture content played in the display region. When 3D image acquisition is needed, a viewing user can send a request for a 3D image through a terminal device, and the request can carry the position of the viewing user. The display device 200 can then open the light-transmitting region of the light selecting structure 210b corresponding to the position of the viewing user, so that light can pass through that region; it controls at least one camera of the camera array 230 corresponding to the light-transmitting region to start 3D image acquisition, obtains 3D image information, and sends the 3D image information to the viewing user's terminal device so that the viewing user can view the 3D image. If the viewing user moves, the display device 200 adjusts the light-transmitting region and the activated camera in real time.
The light field collection region may be disposed at the upper portion or the lower portion of the display device 200; the specific position is not limited here (in fig. 2B, the light field collection region is disposed at the upper portion of the display device 200). The terminal device of the viewing user is a terminal device capable of 3D image display.
Through this scheme, the part performing light field acquisition can share the same display panel 210 with the display part, so the device is more convenient to use and the overall structure is more attractive. When light field acquisition is performed, the corresponding light-transmitting region can be determined according to the position of the viewing user and placed in a light-transmitting state, and the camera corresponding to that region is started, so that image light can pass through the light-transmitting region onto the camera; the camera then performs 3D image acquisition, and the acquired 3D image is sent to the viewing user side for display. Because the light-transmitting region is adjusted according to the position of the viewing user, the camera arranged on the display device 200 can acquire the corresponding 3D image from the viewing user's angle of view without algorithm synthesis, which guarantees the 3D image quality, reduces the delay time, and ensures the real-time performance of 3D image transmission.
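The region-and-camera selection described above can be sketched as follows. This is a minimal illustration assuming a viewer position normalised to [0, 1] and cameras evenly spaced across the light field collection region; the function and parameter names are ours, not the patent's:

```python
def select_cameras(viewer_x, num_cameras):
    """Map a normalised horizontal viewer position (0.0 to 1.0) to the
    index/indices of the camera(s) behind the light-transmitting region.

    Cameras are assumed evenly spaced, camera i sitting at i/(num_cameras-1).
    If the region aligns with one entrance aperture, one camera suffices;
    otherwise the two nearest cameras are both switched on.
    """
    pos = viewer_x * (num_cameras - 1)   # viewer position in camera units
    nearest = round(pos)
    if abs(pos - nearest) < 1e-9:
        return [nearest]                 # region matches one aperture
    return [int(pos), int(pos) + 1]      # region lies between two apertures
```

For example, with five cameras a viewer at the centre maps to the single middle camera, while an off-grid position maps to the two cameras that bracket it.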
In some embodiments, the Display panel 210 is an LCD (Liquid Crystal Display) panel. As shown in fig. 2A, the display panel 210 includes:
a first substrate 212, including a color film layer 2121 whose orthographic projection falls in the display area; a liquid crystal layer 213; a second substrate 214, including a plurality of light transmitting components 2141 whose orthographic projections fall in the light field collection region, configured to determine a light-transmitting region of the light field collection region according to the position of the viewing user and to open at least one light transmitting component 2141 corresponding to the light-transmitting region. In addition, the display panel 210 further includes: a first polarizer 211 arranged on the light-facing side of the first substrate; and a second polarizer 215 arranged on the backlight side of the second substrate 214, in the orthographic projection of the light field collection region.
In a practical implementation, the first polarizer 211, the first substrate 212, the liquid crystal layer 213, the second substrate 214, and the second polarizer 215 of the display panel 210 are arranged in sequence along the light incident direction. The first substrate 212 and the second substrate 214 are transparent substrates, glass substrates being preferred in this embodiment.
In the display area, the color film layer 2121 is disposed on the corresponding first substrate 212, and the backlight module 220 is disposed behind the second polarizer 215, so as to ensure that the picture content can be normally played in the display area.
In the light field collection region, the corresponding first substrate 212 has no color film layer 2121, so the first substrate 212 in this region is plainly transparent, which ensures the light transmission effect. A plurality of light transmitting components 2141 are disposed in the light field collection region of the second substrate 214. When not operating, the light transmitting components 2141 are turned off and the corresponding portion of the liquid crystal layer 213 stays opaque; when light transmission is required, the corresponding light transmitting components 2141 are turned on, and they regulate the transmittance of the corresponding portion of the liquid crystal layer 213 so that light can pass through it. The camera array 230 is disposed on the backlight side of the second polarizer 215, ensuring that the camera array 230 can perform light field collection to form a 3D image.
Through the scheme, the light field acquisition part and the display part can share the same display panel 210, so that a user can realize two functions of displaying and light field acquisition in one display device 200, and the use is convenient.
In some embodiments, as shown in fig. 2E, the light transmissive member 2141 includes:
a transparent pixel electrode 21411; a driving unit 21412 electrically connected to the transparent pixel electrode 21411 and configured to provide an electrical signal to the transparent pixel electrode 21411 to change the light transmittance of the liquid crystal layer 213 corresponding to the transparent pixel electrode 21411.
In a specific implementation, the transparent pixel electrodes 21411 are made of a transparent conductive material such as ITO (indium tin oxide) and are uniformly and densely arranged on the light field collection portion of the second substrate 214. After the light-transmitting region is determined, the driving units 21412 corresponding to the region are started; each driving unit 21412 provides a high-level electrical signal to its transparent pixel electrode 21411, putting the electrode in a high-voltage state, which changes the electric field across the liquid crystal layer 213 and increases its transmittance, so that the region becomes light-transmitting. In this way, the light-transmitting region formed by the light selecting structure 210b is controllable.
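The electrode driving just described can be illustrated with a small sketch. The names and the one-dimensional electrode row are our assumptions; real driving units output analog voltage levels to the transparent pixel electrodes rather than the 0/1 flags used here:

```python
def drive_electrodes(num_electrodes, region):
    """Return the drive level of each transparent pixel electrode in a row:
    1 = high level (liquid crystal made transmissive), 0 = low level (the
    opaque dark state). region is a half-open (start, end) index pair
    covering the chosen light-transmitting region."""
    start, end = region
    return [1 if start <= i < end else 0 for i in range(num_electrodes)]

# Electrodes 2..4 form the light-transmitting region; the rest stay dark.
levels = drive_electrodes(8, (2, 5))
```

When the viewer moves, the controller simply recomputes the region and re-drives the row, which is what makes the light-transmitting region controllable in real time.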
In some embodiments, as shown in fig. 2D, the camera array 230 is a linear array of cameras 230.
In a specific implementation, the length of the linearly arranged camera array 230 corresponds to the width of the light field collecting region, and the adjacent cameras in the camera array 230 are arranged closely, and the number of the cameras arranged in the corresponding camera array 230 may be adjusted according to the size of the cameras and the width of the light field collecting region, which is not limited herein.
In some embodiments, each camera in the camera array 230 is an RGB (Red, Green, Blue) camera.
In specific implementation, a traditional camera for collecting a 3D image is an RGBD camera, which can collect depth information of an image, and the collected image needs to be modeled based on the depth information to generate the 3D image. The display device 200 of the present embodiment uses a common RGB camera, which is less costly and the process of 3D image acquisition is simpler and faster.
Based on the same inventive concept, the present embodiment provides a light field acquisition method applied to the display device described in the above embodiments.
As shown in fig. 3A, the implementation process of the light field acquisition method includes:
step 301, determining a light transmission region of a light selection structure in a display panel corresponding to a position of a viewing user according to the position of the viewing user, and controlling the light transmission region to be in a light transmission state.
Step 302, controlling at least one camera corresponding to the light-transmitting area in the camera array to start 3D image acquisition, so as to obtain 3D image information.
In some embodiments, step 302 is divided into two cases:
first, as shown in fig. 3B and 3C:
step 3021, in response to determining that the light-transmissive region is located between the light-entrance apertures of two cameras in the camera array, treating the two cameras as a first and second target camera (e.g., first and second target cameras s +1 in fig. 3B and 3C), respectively.
Step 3022, controlling the first target camera to acquire first image information, controlling the second target camera to acquire second image information, and adding the first image information and the second image information to form 3D image information.
In particular implementations, the field of view captured by the two cameras is divided into a 0 field of view for vertical light intake (as shown in FIG. 3B) and a 1 field of view for oblique light intake (as shown in FIG. 3C). In most cases, 3D image capture is performed by two cameras, and the pixels of the two captured images (e.g., photosensitive pixels s_0 and s+1_0 of the 0 field of view in fig. 3B, and photosensitive pixels s_1 and s+1_1 of the 1 field of view in fig. 3C) are added to form the final 3D image. The imaging process is performed on the display device, and the two images can be combined by pixel addition without a complex algorithm (such as a viewpoint interpolation synthesis algorithm), so the delay can be reduced and the real-time performance of 3D image acquisition is ensured.
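The pixel-addition step can be sketched as an element-wise sum of the two captured images. This is a simplified illustration with nested-list grayscale images and assumed names; the actual device sums the photosensitive-pixel values of the matching fields of view:

```python
def add_images(first, second):
    """Element-wise sum of two same-sized images (rows of pixel values),
    forming the combined 3D image information from the first and second
    target cameras without any interpolation."""
    return [
        [a + b for a, b in zip(row1, row2)]
        for row1, row2 in zip(first, second)
    ]
```

Unlike the related-art interpolation, this is a single addition per pixel, which is why the delay stays low.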
FIG. 3D shows a schematic of two cameras and corresponding apertures for the light transmissive regions; fig. 3E shows a schematic diagram of summing pixels captured by two cameras.
In some embodiments, the adding the first image information and the second image information in step 3022 to form 3D image information specifically includes:
at step 30221, the aperture of the light-transmissive region is determined (e.g., C0 in fig. 3D and 3E).
Step 30222, adding the pixel points in the first image information (e.g., Fn 'and En' in fig. 3D and 3E) and the pixel points in the second image information (e.g., Fn +1 'and En + 1' in fig. 3D and 3E) according to the aperture of the light-transmissive region and the geometric relationship between the first target camera (e.g., Cn in fig. 3D and 3E) and the second target camera (e.g., Cn +1 in fig. 3D and 3E) to obtain a sum result.
Step 30223, filling the summation result to the pixel point imaged corresponding to the aperture of the light-transmitting area to form the 3D image information.
In a specific implementation, the light-transmitting region can be set to a shape suited to actual needs, preferably a circular region. Through this scheme, the images collected by the two cameras are added according to the geometric relationship; the addition algorithm is simple, and the first image information and the second image information are added into the pixel points imaged through the aperture of the light-transmitting region. The 3D image information obtained in this way is clear, and no image distortion occurs.
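Under an assumed one-dimensional geometry, the aperture-based summation of steps 30221 to 30223 might look like the following sketch. All names, the offset rule, and the scale factor are our illustrative assumptions, not the patent's actual geometric relationship:

```python
def fill_aperture_pixel(img_n, img_n1, x_c0, x_cn, x_cn1, scale=1.0):
    """Return the output pixel imaged at aperture position x_c0 by summing
    one pixel from camera n (at x_cn) and one from camera n+1 (at x_cn1).

    img_n, img_n1: one scanline per camera (lists of pixel values).
    Each camera's contribution is read at an offset proportional to that
    camera's horizontal distance from the aperture centre.
    """
    off_n = int(round(scale * (x_c0 - x_cn)))
    off_n1 = int(round(scale * (x_c0 - x_cn1)))
    centre = len(img_n) // 2
    return img_n[centre + off_n] + img_n1[centre + off_n1]

# Aperture midway between cameras at -1 and +1: symmetric offsets.
pixel = fill_aperture_pixel([10, 20, 30, 40, 50], [1, 2, 3, 4, 5], 0, -1, 1)
```

The point of the sketch is only that the mapping is a fixed geometric lookup plus one addition, with no iterative synthesis.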
Second, as shown in fig. 3F:
step 3021', in response to determining that the optically transmissive region matches an optical entrance aperture of one camera in the array of cameras, treating the one camera as a target camera.
Step 3022', controlling the target camera to perform 3D image acquisition in the light entrance aperture, so as to obtain 3D image information.
In a specific implementation, the case where the light entrance aperture of a camera exactly matches the light-transmitting region is relatively rare. In this case, only that camera is needed for 3D image acquisition, and the corresponding 3D image information can be obtained directly without image addition.
In some embodiments, since the user's perception comes from both eyes, the positions of the viewing user's two eyes are directly taken as the position of the viewing user. This ensures a more realistic 3D visual experience for the user.
Through the above embodiments, the light-transmitting region is adjusted according to the position of the viewing user, and the camera arranged on the display device can acquire the corresponding 3D image from the viewing user's angle of view without algorithm synthesis, which guarantees the 3D image quality, reduces the delay time, and ensures the real-time performance of 3D image transmission.
It should be noted that the above describes some embodiments of the disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Based on the same inventive concept, this embodiment provides an image data transmission method, which is applicable both to one-party video monitoring and to two-party video calls.
In one-party video monitoring, the viewing user is the monitoring party, and the method is executed by a terminal device provided with the display device (for example, a television, mobile phone, tablet, or computer capable of 3D image acquisition).
In a two-party video call, both parties are viewing users (for example, user A and user B), and the method below is executed by the terminal devices provided with display devices that the two users operate (for example, terminal A and terminal B). That is, terminal A transmits its acquired 3D image to terminal B for display, and terminal B transmits its acquired 3D image to terminal A for display.
As shown in fig. 4, the image data transmission method comprises:
Step 401, receiving a 3D image acquisition instruction sent by the viewing user side.
Step 402, receiving the position of the viewing user sent by the position tracking device.
Step 403, acquiring 3D image information based on the 3D image acquisition instruction and the position of the viewing user, using the light field acquisition method described in the above embodiments. The light field acquisition process for the 3D image information is the same as in the above embodiments and is not repeated here.
Step 404, sending the 3D image information to the viewing user side for 3D display.
In a specific implementation, the 3D image acquisition instruction may be the monitoring request or the video call request. The position tracking device is arranged at the viewing user's side; it collects the position of the viewing user and sends it to the execution body of the method of this embodiment (for example, the terminal device).
The tracking device may be integrated into the viewing user's terminal or configured separately; to ensure that the position it reports is accurate, the tracking device and the viewing user's terminal should be placed close to each other. In addition, the 3D image acquisition instruction may be sent together with the position of the viewing user, or the instruction may be sent first and the position afterwards.
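Steps 401-404 above can be sketched as one handler on the acquiring terminal. The interfaces (`position_tracker`, `light_field`, `viewer_link`) are hypothetical stand-ins for the position tracking device, the display device's light field acquisition method, and the channel back to the viewing user side; none of these names come from the patent.

```python
def handle_3d_request(request, position_tracker, light_field, viewer_link):
    """One pass of the image data transmission method (steps 401-404),
    executed on the terminal device that performs 3D image acquisition."""
    instruction = request                                    # step 401: acquisition instruction
    viewer_pos = position_tracker.get_position()             # step 402: viewing user position
    image_3d = light_field.acquire(instruction, viewer_pos)  # step 403: light field acquisition
    viewer_link.send(image_3d)                               # step 404: send for 3D display
    return image_3d
```

The point of this structure is that acquisition and imaging complete locally; the viewing user side only displays what it receives.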
In some embodiments, step 402 comprises: receiving the positions of the two eyes of the viewing user from an eye tracking device, this being the device capable of determining the eye positions. Preferably, the eye tracking device is provided at the viewing user's side. The positions of the two eyes are determined from the eyeballs of the viewing user's two eyes; if the viewing user has a single eye, the single-eye position acquired by the eye tracking device is used as the position of both eyes.
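The binocular rule with its monocular fallback is simple enough to state in code. This is a sketch; the function name and the `(x, y, z)`-tuple-or-`None` convention for tracker output are assumptions, not the patent's interface.

```python
def viewer_position_from_eyes(left_eye, right_eye):
    """Derive the viewing-user position from eye tracking output.

    Each argument is an (x, y, z) tuple, or None if that eye was not
    detected. A single detected eye is reused for both positions, matching
    the monocular fallback described in the text."""
    if left_eye is None and right_eye is None:
        raise ValueError("eye tracker detected no eyes")
    if left_eye is None:
        left_eye = right_eye    # monocular user: reuse the single eye position
    if right_eye is None:
        right_eye = left_eye
    return left_eye, right_eye
```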
Through the scheme of the above embodiments, the corresponding light-transmitting area can be determined according to the position of the viewing user, that area can be placed in the light-transmitting state, and the camera corresponding to it can be turned on, so that image light passes through the light-transmitting area onto the camera for 3D image acquisition. The whole 3D imaging process is completed on the terminal device that performs the acquisition; the viewing user's side only needs to display the 3D image, with no further algorithmic processing, which reduces delay and guarantees a real-time effect.
It should be noted that the method of the embodiments of the present disclosure may be executed by a single device, such as a computer or a server. The method of the embodiment can also be applied to a distributed scene and completed by the mutual cooperation of a plurality of devices. In such a distributed scenario, one of the multiple devices may only perform one or more steps of the method of the embodiments of the present disclosure, and the multiple devices interact with each other to complete the method.
The embodiment of the disclosure also provides an image data transmission device, which can be used for 3D image acquisition.
As shown in fig. 5, the image data transmission apparatus 500 includes:
a receiving module 510 configured to receive a 3D image acquisition instruction sent by the viewing user side, and to receive the position of the viewing user sent by a position tracking device;
a 3D imaging module 520 configured to acquire 3D image information based on the 3D image acquisition instruction and the position of the viewing user by using the light field acquisition method described in the above embodiment;
a sending module 530 configured to send the 3D image information to the viewing user terminal for 3D display.
In some embodiments, the receiving module 510 is further configured to receive the positions of the eyes of the viewing user from the eye tracking device.
For convenience of description, the above devices are described as being divided into various modules by functions, and are described separately. Of course, the functionality of the various modules may be implemented in the same one or more pieces of software and/or hardware in the practice of the present application.
The apparatus of the foregoing embodiment is used to implement the corresponding image data transmission method in any of the foregoing embodiments, and has the beneficial effects of the corresponding image data transmission method embodiment, which are not described herein again.
Based on the same inventive concept, corresponding to the method of any of the above embodiments, the present disclosure further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the program, the method of any of the above embodiments is implemented.
Fig. 6 shows a hardware structure diagram of an exemplary computer device 600 provided by the embodiment of the disclosure. The apparatus 600 may include: a processor 602, a memory 604, an input/output interface 606, a communication interface 608, and a bus 610. Wherein the processor 602, memory 604, input/output interface 606, and communication interface 608 enable communication connections within the device with each other via a bus 610.
The processor 602 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is configured to execute related programs to implement the technical solutions provided in the embodiments of the present specification.
The memory 604 may be implemented in the form of ROM (Read-Only Memory), RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 604 may store an operating system and other application programs; when the technical solutions provided by the embodiments of the present specification are implemented by software or firmware, the relevant program code is stored in the memory 604 and called by the processor 602 for execution.
The input/output interface 606 is used for connecting an input/output module to realize information input and output. The input/output module may be configured as a component of the device (not shown in the figure), or may be external to the device to provide corresponding functions. The input devices may include a keyboard, a mouse, a touch screen, a microphone, and various sensors, and the output devices may include a display, a speaker, a vibrator, an indicator light, and the like.
The communication interface 608 is used for connecting a communication module (not shown in the figure) to implement communication interaction between the present device and other devices. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, Bluetooth and the like).
Bus 610 includes a path that transfers information between various components of the device, such as processor 602, memory 604, input/output interface 606, and communication interface 608.
It should be noted that although the device 600 is shown with only the processor 602, the memory 604, the input/output interface 606, the communication interface 608, and the bus 610, in a specific implementation, the device 600 may also include other components necessary for normal operation. Furthermore, those skilled in the art will appreciate that the apparatus 600 described above may also include only the components necessary to implement the embodiments of the present description, and not necessarily all of the components shown in the figures.
The computer device 600 of the foregoing embodiment is used to implement the corresponding method in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Based on the same inventive concept, the present disclosure also provides a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the method according to any of the above embodiments, corresponding to any of the above-described embodiment methods.
Computer-readable media of the present embodiments, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The computer instructions stored in the storage medium of the above embodiment are used to enable the computer to execute the method according to any of the above embodiments, and have the beneficial effects of the corresponding method embodiment, and are not described herein again.
Based on the same inventive concept, corresponding to any of the above-described method embodiments, the present disclosure also provides a computer program product comprising a computer program. In some embodiments, the computer program is executable by one or more processors to cause the processors to perform the method; the processor executing a given step may be the execution subject corresponding to that step in the method embodiments.
The computer program product of the foregoing embodiment is used to enable a processor to execute the method according to any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Those of ordinary skill in the art will understand: the discussion of any embodiment above is meant to be exemplary only, and is not intended to imply that the scope of the disclosure, including the claims, is limited to these examples; within the idea of the present disclosure, technical features in the above embodiments or in different embodiments may also be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the embodiments of the present disclosure as described above, which are not provided in detail for the sake of brevity.
In addition, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown in the provided figures for simplicity of illustration and discussion, and so as not to obscure the embodiments of the disclosure. Furthermore, devices may be shown in block diagram form in order to avoid obscuring embodiments of the present disclosure, and this also takes into account the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform within which the embodiments of the present disclosure are to be implemented (i.e., specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the disclosure, it should be apparent to one skilled in the art that the embodiments of the disclosure can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative instead of restrictive.
While the present disclosure has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of these embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description. For example, other memory architectures (e.g., dynamic RAM (DRAM)) may use the discussed embodiments.
The disclosed embodiments are intended to embrace all such alternatives, modifications, and variations which fall within the broad scope of the appended claims. Therefore, any omissions, modifications, equivalents, improvements, and the like that may be made within the spirit and principles of the embodiments of the disclosure are intended to be included within the scope of the disclosure.

Claims (15)

1. A display device comprising a display area and a light field acquisition area, the display device comprising:
the display panel comprises a display structure arranged in the display area and a light selecting structure arranged in the light field acquisition area, and is configured to determine a light transmission area of the light selecting structure according to the position of a viewing user, and to control the light transmission area to be in a light transmission state;
the backlight module is arranged on the backlight side of the display structure;
and the camera array is arranged on the backlight side of the light selecting structure and is configured to start at least one camera corresponding to the light transmitting area to acquire a 3D image so as to obtain 3D image information.
2. The display device according to claim 1, wherein the display panel comprises:
the first substrate comprises a color film layer with orthographic projection positioned in the display area;
a liquid crystal layer;
a second substrate comprising a plurality of light-transmissive components orthographically positioned at the light field collection region, configured to determine a light-transmissive region of the light field collection region according to a position of a viewing user, and to open at least one light-transmissive component corresponding to the light-transmissive region.
3. The display device according to claim 2, wherein the light-transmitting member comprises:
a transparent pixel electrode;
and the driving unit is electrically connected with the transparent pixel electrode and is configured to provide an electric signal to the transparent pixel electrode so as to change the light transmittance of the liquid crystal layer corresponding to the transparent pixel electrode.
4. The display device of claim 1, wherein each camera in the array of cameras is an RGB camera.
5. A light field acquisition method applied to the display device of any one of claims 1 to 4, comprising:
determining a light transmission area of a light selection structure in a display panel corresponding to the position of a watching user according to the position of the watching user, and controlling the light transmission area to be in a light transmission state;
and controlling at least one camera corresponding to the light transmission area in the camera array to start 3D image acquisition to obtain 3D image information.
6. The light field acquisition method according to claim 5, wherein said controlling at least one camera corresponding to said transparent region in the camera array to initiate 3D image acquisition, resulting in 3D image information, comprises:
responsive to determining that the light-transmissive region is located between the light entry apertures of two cameras in the camera array, treating the two cameras as a first target camera and a second target camera, respectively;
and controlling the first target camera to acquire first image information, controlling the second target camera to acquire second image information, and adding the first image information and the second image information to form 3D image information.
7. The light field acquisition method according to claim 6, wherein said summing the first image information and the second image information to form 3D image information comprises:
determining an aperture of the light-transmitting area;
adding the pixel points in the first image information and the pixel points in the second image information according to the aperture of the light-transmitting area and the geometric relationship between the first target camera and the second target camera to obtain an addition result;
and filling the addition result to the pixel point of the aperture corresponding imaging of the light-transmitting area to form the 3D image information.
8. The light field acquisition method according to claim 5, wherein said controlling at least one camera corresponding to said transparent region in the camera array to initiate 3D image acquisition, resulting in 3D image information, comprises:
in response to determining that the optically transmissive region matches an optical entrance aperture of one camera in the array of cameras, treating the one camera as a target camera;
and controlling the target camera to acquire a 3D image in the light entrance aperture to obtain 3D image information.
9. The light field acquisition method according to any one of claims 5 to 8, wherein the position of the viewing user is a position of both eyes of the viewing user.
10. An image data transmission method comprising:
receiving a 3D image acquisition instruction sent by a watching user side;
receiving the position of the viewing user sent by the position tracking device;
acquiring 3D image information based on the 3D image acquisition instruction and the position of the viewing user using the light field acquisition method of any one of claims 5 to 8;
and sending the 3D image information to the watching user side for 3D display.
11. The image data transmission method according to claim 10, wherein the receiving of the position of the viewing user sent by the position tracking device comprises:
the positions of the eyes of a viewing user from an eye tracking device are received.
12. An image data transmission apparatus comprising:
the receiving module is configured to receive a 3D image acquisition instruction sent by a viewing user terminal and receive the position of the viewing user sent by the position tracking device;
a 3D imaging module configured to acquire 3D image information based on the 3D image acquisition instruction and the position of the viewing user using the light field acquisition method of any one of claims 5 to 9;
a sending module configured to send the 3D image information to the viewing user terminal for 3D display.
13. A computer device comprising one or more processors, memory; and one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, the programs comprising instructions for performing the method of any of claims 5-11.
14. A non-transitory computer-readable storage medium containing a computer program which, when executed by one or more processors, causes the processors to perform the method of any one of claims 5 to 11.
15. A computer program product comprising computer program instructions which, when run on a computer, cause the computer to perform the method of any of claims 5 to 11.
CN202210167540.3A 2022-02-23 2022-02-23 Display device, light field acquisition method, image data transmission method and related equipment Active CN114567767B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210167540.3A CN114567767B (en) 2022-02-23 2022-02-23 Display device, light field acquisition method, image data transmission method and related equipment


Publications (2)

Publication Number Publication Date
CN114567767A true CN114567767A (en) 2022-05-31
CN114567767B CN114567767B (en) 2024-06-18

Family

ID=81714629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210167540.3A Active CN114567767B (en) 2022-02-23 2022-02-23 Display device, light field acquisition method, image data transmission method and related equipment

Country Status (1)

Country Link
CN (1) CN114567767B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024088004A1 (en) * 2022-10-28 2024-05-02 京东方科技集团股份有限公司 Display device and image acquisition method therefor

Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007251778A (en) * 2006-03-17 2007-09-27 Saxa Inc Image input-output apparatus
US20080043100A1 (en) * 2006-07-31 2008-02-21 Irwin Sobel Projection screen and camera array
CN102385181A (en) * 2010-09-06 2012-03-21 联想(北京)有限公司 Liquid crystal display screen as well as control method thereof and terminal equipment including display screen
CN103019021A (en) * 2012-12-27 2013-04-03 Tcl集团股份有限公司 3D (three-dimensional) light field camera and method for processing images shot by same
CN103702099A (en) * 2013-12-17 2014-04-02 四川大学 Ultra-large visual-angle integrated-imaging 3D(Three-Dimensional)displaying method based on head tracking
CN104272371A (en) * 2012-04-08 2015-01-07 三星电子株式会社 Transparent display apparatus and method thereof
CN105763862A (en) * 2014-09-03 2016-07-13 乐金显示有限公司 Image display apparatus
CN105763865A (en) * 2016-02-26 2016-07-13 北京邮电大学 Naked eye 3D augmented reality method and device based on transparent liquid crystals
US20170019627A1 (en) * 2015-07-14 2017-01-19 Google Inc. Immersive teleconferencing with translucent video stream
CN106375694A (en) * 2015-12-31 2017-02-01 北京智谷睿拓技术服务有限公司 Light field display control method and device, and light field display equipment
CN107105216A (en) * 2017-06-02 2017-08-29 北京航空航天大学 A kind of continuous parallax based on pinhole array, the 3 d light fields display device of wide viewing angle
CN108513123A (en) * 2017-12-06 2018-09-07 中国人民解放军陆军装甲兵学院 A kind of pattern matrix generation method that integration imaging light field is shown
CN108965771A (en) * 2018-09-29 2018-12-07 京东方科技集团股份有限公司 A kind of video calling control device and method, video call system
CN110024002A (en) * 2016-12-30 2019-07-16 谷歌有限责任公司 For using contribution region to carry out the system and method for reconstruction of optical wave field
CN110324553A (en) * 2018-03-28 2019-10-11 北京富纳特创新科技有限公司 Outdoor scene window system based on video communication
CN210091609U (en) * 2019-06-26 2020-02-18 深圳市光鉴科技有限公司 Display device with 3D camera module and electronic equipment
CN110928075A (en) * 2019-12-17 2020-03-27 京东方科技集团股份有限公司 Display device, manufacturing method, and display and camera shooting method
CN111158161A (en) * 2019-12-26 2020-05-15 西安电子科技大学 Integrated imaging light field information acquisition and display method
CN111726498A (en) * 2020-06-23 2020-09-29 Oppo广东移动通信有限公司 Electronic equipment
CN111771374A (en) * 2019-01-14 2020-10-13 京东方科技集团股份有限公司 Display device, electronic apparatus, and method of driving display device
CN111818324A (en) * 2020-05-26 2020-10-23 北京邮电大学 Device and method for generating three-dimensional large-visual-angle light field
US20210021785A1 (en) * 2019-07-18 2021-01-21 Microsoft Technology Licensing, Llc Light field camera modules and light field camera module arrays
CN112584080A (en) * 2016-09-09 2021-03-30 谷歌有限责任公司 Three-dimensional telepresence terminal and method



Also Published As

Publication number Publication date
CN114567767B (en) 2024-06-18


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant