CN109194942B - Naked eye 3D video playing method, terminal and server - Google Patents

Naked eye 3D video playing method, terminal and server

Info

Publication number
CN109194942B
CN109194942B (application number CN201811344932.2A; published as CN109194942A, granted as CN109194942B)
Authority
CN
China
Prior art keywords
picture
grating
server
video
rendering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811344932.2A
Other languages
Chinese (zh)
Other versions
CN109194942A (en)
Inventor
陈春豪
陆小松
张涛
蒲天发
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Thredim Photoelectric Co ltd
Original Assignee
Jiangsu Thredim Photoelectric Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Thredim Photoelectric Co ltd filed Critical Jiangsu Thredim Photoelectric Co ltd
Priority to CN201811344932.2A priority Critical patent/CN109194942B/en
Publication of CN109194942A publication Critical patent/CN109194942A/en
Application granted granted Critical
Publication of CN109194942B publication Critical patent/CN109194942B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention provides a naked eye 3D video playing method, a terminal and a server, and relates to the technical field of naked eye 3D video playing. The naked eye 3D video playing method comprises the following steps: the client exports a picture carrying the 3D screen grating parameters; the client sends the picture to the server; the server obtains the picture carrying the 3D screen grating parameters sent by the client, performs 3D rendering on the video source based on the picture, re-records during the 3D rendering process to obtain a video file, and then sends the video file to the client for playing. Because the server performs the 3D rendering according to the screen grating parameters carried by the picture and re-records the result into a new video file, the matching between the video source and the 3D screen is improved, and the performance requirement of the naked eye 3D technology on the 3D screen is reduced.

Description

Naked eye 3D video playing method, terminal and server
Technical Field
The invention relates to the technical field of naked eye 3D video playing, in particular to a naked eye 3D video playing method, a terminal and a server.
Background
With the rapid development of science and technology, 3D stereoscopic display technology is being applied ever more widely. Naked eye 3D technology in particular frees viewers from the constraint of special glasses, greatly extending both the application range and the viewing comfort of 3D technology.
At present, naked eye 3D video playing is implemented by first selecting a video source, then performing 3D rendering, and then playing the video directly on the terminal device.
However, the 3D screens of different terminals differ in size and model, and their screen grating parameters differ as well. Realizing naked eye 3D places high demands on the performance of the terminal device; when the video source is a 4K video source the demands are higher still, the playing effect of the video source is poor, and the user experience of naked eye 3D suffers. The high performance requirement on the terminal device also prevents naked eye 3D technology from being widely applied.
Disclosure of Invention
The present invention provides a naked eye 3D video playing method, a terminal and a server, aiming at the above disadvantages in the prior art.
In order to achieve the above purpose, the embodiment of the present invention adopts the following technical solutions:
in a first aspect, an embodiment of the present invention provides a naked eye 3D video playing method, applied to a server, including: the method comprises the steps that a server obtains a picture which is sent by a client and carries 3D screen grating parameters, and the size of the picture corresponds to the size of a 3D screen;
the server performs 3D rendering on a video source based on the picture, and performs re-recording in the 3D rendering process to obtain a video file;
and the server sends the video file to a client for playing.
Further, the step of the server performing 3D rendering on the video source based on the picture includes:
and the server displays the image pixels with different parallaxes in the video source based on the position of each pixel point in the picture.
Further, the step of performing, by the server, 3D rendering on a video source based on the picture includes:
and the server performs 3D rendering on the same video source according to the different grating parameters carried by the picture, the resulting images being different.
In a second aspect, an embodiment of the present invention further provides a naked eye 3D video playing method, applied to a client, including:
the method comprises the steps that a client exports a picture carrying 3D screen grating parameters, and the size of the picture corresponds to the size of a 3D screen;
the client sends the picture to a server, wherein the picture is used for the server to perform 3D rendering on a video source based on the picture, and a video file is obtained after the picture is recorded again;
and the client receives the video file sent by the server for playing.
Further, the picture carries different grating parameters according to the type and size of the 3D screen.
In a third aspect, an embodiment of the present invention further provides a naked eye 3D video playing server, which is applied to a server, where the server includes:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring a picture which is sent by a client and carries 3D screen grating parameters, and the size of the picture corresponds to the size of a 3D screen;
the processing module is used for performing 3D rendering on a video source based on the picture and re-recording in the 3D rendering process to obtain a video file;
and the first communication module is used for sending the video file to a client for playing.
Further, the processing module is configured to perform 3D rendering on a video source based on the picture.
Specifically, the processing module is configured to display image pixels with different parallaxes in the video source based on the position of each pixel point in the picture.
Further, the processing module is configured to perform 3D rendering on a video source based on the picture, and perform re-recording in the 3D rendering process to obtain a video file, and includes:
and the processing module is used for performing different image results of 3D rendering on the same video source based on different grating parameters of the picture.
In a fourth aspect, an embodiment of the present invention further provides a naked eye 3D video playing terminal, which is applied to a client, and includes:
the exporting module is used for exporting the picture carrying the 3D screen grating parameters, and the size of the picture corresponds to the size of the 3D screen;
the second communication module is used for sending the picture to a server, wherein the picture is used for the server to perform 3D rendering on a video source based on the picture, and a video file is obtained after the picture is recorded again;
and the playing module is used for receiving the video file sent by the server and playing the video file.
Further, the picture carries different grating parameters according to the type and size of the 3D screen.
The invention has the beneficial effects that:
the embodiment of the invention provides a naked eye 3D video playing method, a server and a terminal, wherein a picture with the same size as a screen is exported through a client, and the picture carries raster parameter information of a 3D screen. And the server performs 3D rendering processing on the video source according to the screen raster parameters carried by the picture, captures the image from the video source, corresponds each image pixel to the physical position corresponding to the picture, re-records the image to obtain a new video file, and sends the new video file to the client for playing. The video source is processed according to the grating parameters of the 3D screen, so that the effect of playing the video source by the 3D screens of different types and sizes is better, and the performance requirement of a naked eye 3D technology on the 3D screen is reduced.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a flowchart of a server work in a naked eye 3D video playing method according to an embodiment of the present invention;
fig. 2 is a flowchart of a client in a naked eye 3D video playing method according to an embodiment of the present invention;
fig. 3 is a schematic block diagram of a server for playing a naked eye 3D video according to an embodiment of the present invention;
fig. 4 is a schematic block diagram of a naked eye 3D video playing terminal according to an embodiment of the present invention.
Icon: 103-a first communication module; 104-a processing module; 105-an acquisition module; 106-a derivation module; 107-a second communication module; 108-Play Module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention.
First embodiment
Fig. 1 is a work flow diagram of the server in a naked eye 3D video playing method provided by an embodiment of the present invention. The embodiment provides a naked eye 3D video playing method applied to a server; the server may be a dedicated server, a computer, a cloud disk or another such device, and may establish a communication connection with the terminal playing the naked eye 3D video through communication methods such as the Internet, a 4G/5G mobile communication network and the Internet of Things. The method includes:
s101: and the server acquires a picture carrying 3D screen raster parameters sent by the client, wherein the size of the picture corresponds to the size of the 3D screen.
Specifically, the client for naked eye 3D video playing, that is, the terminal that plays the video, may be a hardware device such as a smart phone, tablet computer, notebook, desktop computer or television, each correspondingly provided with an APP (application) for playing naked eye 3D video. Terminals playing naked eye 3D videos differ in the type and size of their 3D screens, and the grating characteristics, type, specification, thickness and size of those screens differ accordingly, so the 3D screens of different terminals produce different image effects; the grating parameters of the various 3D screens are stored in the client.
Specifically, 3D screens of different types and sizes differ in grating parameters, type, specification, thickness and size. The grating parameters specifically refer to the grating density, grating thickness, grating viewing distance, grating light transmittance, grating deviation value and the like.
Because the grating image effects of different screens differ, the client stores the grating parameters of the different 3D screens, compiles them into a mapping table, and writes the table into a picture of the same size as the screen.
Using a picture as the carrier makes it convenient to record the grating parameter information of the screen: after receiving the picture, the server can map the pixel points of the video source to physical positions on the picture, which makes rendering the video source more convenient. The size of the picture also indicates the size of the selected 3D screen.
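As a concrete illustration, the client-side mapping table of grating parameters might look like the sketch below. All field names and numeric values here are hypothetical; the patent only names the parameter categories (grating density, thickness, viewing distance, light transmittance, deviation value).

```python
# Hypothetical sketch of the client's grating-parameter mapping table.
# Keys and values are illustrative, not taken from the patent.
GRATING_TABLE = {
    # (screen model, diagonal size in inches) -> grating parameters
    ("model-A", 46): {
        "density": 60.0,            # gratings per inch (illustrative)
        "thickness": 0.5,           # mm
        "viewing_distance": 3000,   # mm
        "transmittance": 0.92,      # fraction of light passed
        "offset": 0.12,             # grating deviation value
    },
    ("model-B", 21.5): {
        "density": 96.0,
        "thickness": 0.3,
        "viewing_distance": 600,
        "transmittance": 0.90,
        "offset": 0.05,
    },
}

def lookup_grating_params(model, size):
    """Return the grating parameters for a given 3D screen, or None."""
    return GRATING_TABLE.get((model, size))
```

The record selected for the current terminal would then be encoded into the exported picture so that the server can read the parameters back.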
S102: and 3D rendering is carried out on the video source by the server based on the picture, and the video file is obtained by re-recording in the 3D rendering process.
Specifically, after acquiring the picture carrying the 3D screen raster parameter, the server may render the naked-eye 3D video source to be played based on the picture.
It should be noted that the picture is equivalent to a table built from the grating parameters of the screen. During rendering, the video source is split into frame-by-frame images, and each pixel of each frame is assigned to the corresponding cell of the table, which completes the rendering.
Specifically, different pixels have different parallaxes; parallax is the difference in apparent direction that results from viewing the same object from two points separated by a certain distance. During rendering, mapping is performed according to the differing parallaxes of the image pixels: each image pixel is matched against the grating parameters of the screen and displayed at the corresponding physical position of the picture.
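The patent does not give the mapping formula itself. The sketch below shows one standard way such a parallax-to-position assignment is done for slanted lenticular gratings, with `pitch`, `offset` and `slope` standing in for the grating density and deviation parameters; the formula and all default values are assumptions, not taken from this document.

```python
def view_for_subpixel(x, y, subpixel, pitch=7.5, offset=0.0, slope=0.2, n_views=8):
    """Pick which parallax view supplies one output subpixel.

    `pitch` (subpixels covered by one lens), `offset` and `slope` stand in
    for the grating density / deviation parameters; this classic lenticular
    interleaving formula and its defaults are assumptions.
    """
    # horizontal subpixel index, shifted by the slanted lens position
    u = 3 * x + subpixel + slope * y - offset
    phase = (u % pitch) / pitch      # position under one lens, in [0, 1)
    return int(phase * n_views) % n_views

def interleave(views, pitch=7.5, offset=0.0, slope=0.2):
    """Compose one output frame from a list of per-view RGB frames."""
    n = len(views)
    h, w = len(views[0]), len(views[0][0])
    out = [[[0, 0, 0] for _ in range(w)] for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for c in range(3):   # the R, G, B subpixels map independently
                v = view_for_subpixel(x, y, c, pitch, offset, slope, n)
                out[y][x][c] = views[v][y][x][c]
    return out
```

Here `views` would be the per-parallax images obtained from the video source; each output subpixel then takes its value from exactly one view.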
And a new video file needs to be obtained by re-recording in the rendering process.
It should be noted that recording a video file requires acquiring each frame as an RGB image through OpenGL (Open Graphics Library), converting the RGB image into YUV (a color coding method adopted by European television systems) and then performing video coding. The specific process of converting one frame of RGB image into YUV is as follows:
both RGB and YUV are color spaces used to represent colors, and both may be converted to each other. The RGB color scheme is a color standard, and various colors are obtained by changing three color channels of red, green and blue and superimposing the three color channels on each other. Wherein R represents red, G represents green, and B represents blue.
YUV is mainly used to optimize the transmission of color video signals; its greatest advantage over RGB transmission is that it occupies very little bandwidth (RGB requires three independent video signals to be transmitted simultaneously). Here "Y" represents the gray-scale value and is a baseband signal, while "U" and "V" represent chrominance, describing the color and saturation of the image and specifying the color of each pixel. U and V are not baseband signals; they are quadrature-modulated during processing. Luminance is created by superimposing specific portions of the RGB input signals; chrominance and saturation are represented by Cr and Cb respectively, where Cr reflects the difference between the red portion of the RGB input signal and the luminance of the RGB signal, and Cb reflects the difference between the blue portion of the RGB input signal and that same luminance. Through these operations YUV and RGB can be converted into each other, as follows:
Y'=0.299*R'+0.587*G'+0.114*B'
U'=-0.147*R'-0.289*G'+0.436*B'=0.492*(B'-Y')
V'=0.615*R'-0.515*G'-0.100*B'=0.877*(R'-Y')
R'=Y'+1.140*V'
G'=Y'-0.394*U'-0.581*V'
B'=Y'+2.032*U'
The conversion can also be written in the scaled and offset form used for 8-bit video, together with its inverse back to R (red), G (green) and B (blue):
Y'=0.257*R'+0.504*G'+0.098*B'+16
Cb'=-0.148*R'-0.291*G'+0.439*B'+128
Cr'=0.439*R'-0.368*G'-0.071*B'+128
R'=1.164*(Y'-16)+1.596*(Cr'-128)
G'=1.164*(Y'-16)-0.813*(Cr'-128)-0.392*(Cb'-128)
B'=1.164*(Y'-16)+2.017*(Cb'-128)
the symbols are provided with prime marks, which indicate that the symbols are gamma corrected on the basis of the original values.
In the process of recording the video, the RGB images captured frame by frame from the video source are converted into YUV through the above conversion process, then video-encoded and recorded into a new video file in a conventional format, such as AVI, WMV, MPEG, MKV, RMVB or MOV.
S103: and the server sends the video file to the client for playing.
Specifically, the server can establish a communication connection with the terminal playing the naked eye 3D video through communication methods such as the Internet, a 4G/5G mobile communication network and the Internet of Things, and sends the video file to the client using the FTP application-layer protocol.
The invention has the beneficial effects that:
The embodiment of the invention provides a naked eye 3D video playing method applied to the server side. The client exports a picture of the same size as the screen, carrying the grating parameter information of the 3D screen. The server performs 3D rendering on the video source according to the screen grating parameters carried by the picture: it captures images from the video source, maps each image pixel to its corresponding physical position on the picture, re-records the result into a new video file, and sends the new video file to the client for playing. Because the video source is processed according to the grating parameters of the 3D screen, 3D screens of different types and sizes play the video source with better effect, and the performance requirement of the naked eye 3D technology on the 3D screen is reduced.
In one embodiment, the server performs a 3D rendering step on the video source based on the picture, including:
s201: and the server displays the image pixels with different parallaxes in the video source based on the position of each point pixel point in the picture.
It should be noted that the server first reads the corresponding grating parameters in the picture and then captures images from the video source. Capturing an image means splitting the video source into frame-by-frame images, grabbing each frame and reading the positions of the pixel points on it; this process can be implemented with languages or tools such as OpenCV, MATLAB, Python and C. The grating parameter information of each point in the picture is then placed in one-to-one correspondence with the pixel points of the video source.
It should be noted that when capturing images from the video source, the pixels of each frame must be mapped according to the length, frame count, number of sampled pixels and other information of the video source.
Grabbing frame-by-frame images from the video source may be implemented with a function such as cvGrabFrame.
It should be noted that displaying an image pixel at the corresponding physical position of the picture is completed by a program: pixels in the image are accessed, and assigned new values where necessary, through functions such as cvGet2D() and cvSet2D().
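Since the legacy OpenCV C API named above (cvGrabFrame, cvGet2D, cvSet2D) is not assumed to be available, here is a dependency-free sketch of the same grab/read/write loop; `position_of` is a hypothetical placeholder for the mapping that the picture's grating parameters would dictate.

```python
def grab_frames(video_source):
    """Stand-in for frame grabbing (cvGrabFrame in the legacy OpenCV C API):
    here `video_source` is simply an iterable of frames, each frame being a
    list of rows of [R, G, B] pixel lists."""
    for frame in video_source:
        yield frame

def get_2d(frame, y, x):
    """Pure-Python analogue of cvGet2D: read one pixel."""
    return frame[y][x]

def set_2d(frame, y, x, value):
    """Pure-Python analogue of cvSet2D: write one pixel."""
    frame[y][x] = value

def render_to_picture(frame, picture, position_of):
    """Place every source pixel at the physical position dictated by the
    picture's grating parameters. `position_of(y, x) -> (ty, tx)` is a
    hypothetical mapping function; the patent does not specify its form."""
    for y, row in enumerate(frame):
        for x in range(len(row)):
            ty, tx = position_of(y, x)
            set_2d(picture, ty, tx, get_2d(frame, y, x))
    return picture
```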
In one embodiment, the server performs a 3D rendering step on the video source based on the picture, including:
s301: the server performs 3D rendering on the same video source based on different raster parameters of the picture, and the image results are different.
It should be noted that the pictures carry raster parameters of the 3D screen, and the corresponding pictures of different screens are different. Because different types and sizes of 3D screens have different characteristics, types, specifications, thicknesses and sizes, the grating density, the grating thickness, the grating visual distance, the grating light transmittance, the grating deviation value and the like of the grating are different.
Second embodiment
Fig. 2 is a schematic diagram of the client work flow according to an embodiment of the present invention, and fig. 4 is a schematic module diagram of a naked eye 3D video playing terminal according to an embodiment of the present invention. A naked eye 3D video playing method applied to a client comprises the following steps:
s401: and the client exports a picture carrying 3D screen raster parameters, and the size of the picture corresponds to the size of the 3D screen.
Specifically, the grating parameters differ according to the type and size of the screen playing the naked eye 3D video, but the client can acquire the grating parameters of the playback terminal and export a picture carrying the grating parameter information of the 3D screen; the size of the picture equals that of the 3D screen, so as to record the grating parameter information of the 3D screen.
It should be noted that the client is provided with a storage system into which the grating parameters of 3D screens of different types and models can be entered manually: grating density, grating thickness, grating viewing distance, grating light transmittance, grating deviation value and so on.
S402: and the client sends the picture to the server, and the picture is used for the server to perform 3D rendering on the video source based on the picture, and the video file is obtained after the picture is recorded again.
Specifically, when sending the picture to the server over the Internet, the client needs to comply with communication protocols such as TCP, HTTP and UDP. Optionally, the client and the server may communicate through a socket: the socket is the communication pipe established to carry out the process above, the two processes communicate through it, and an agreed protocol serves as the rule of the communication.
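A self-contained loopback sketch of this picture upload over a plain TCP socket. The host, the ephemeral-port handling and the payload are illustrative only; a real deployment would follow the agreed application protocol rather than raw bytes.

```python
import socket
import threading

def serve_once(host="127.0.0.1"):
    """Server side of the sketch: listen on an ephemeral loopback port,
    accept one connection and collect everything the client sends."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))               # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]
    received = bytearray()

    def _accept():
        conn, _ = srv.accept()
        while True:
            chunk = conn.recv(4096)
            if not chunk:             # client closed: upload finished
                break
            received.extend(chunk)
        conn.close()
        srv.close()

    worker = threading.Thread(target=_accept)
    worker.start()
    return port, received, worker

def send_picture(port, picture_bytes, host="127.0.0.1"):
    """Client side: connect and stream the exported picture's bytes."""
    with socket.create_connection((host, port)) as s:
        s.sendall(picture_bytes)

# Loopback demo: the payload stands in for the exported picture file.
port, received, worker = serve_once()
send_picture(port, b"picture-with-grating-parameters")
worker.join()
```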
The re-recorded video file adopts a conventional video format, such as AVI, WMV, MPEG, MKV, RMVB or MOV. Corresponding to the picture, the parameters of the re-recorded video change slightly, so the video matches the 3D screen better and the playing effect is improved.
The client receives the video processed by the server and starts playing the video only when receiving an operation instruction of the user.
The picture carries different grating parameters according to the type and size of the 3D screen.
It should be noted that the common 3D screen sizes include 82 inches, 46 inches, 24 inches, 21.5 inches, and so on. In addition to different screen sizes, screen raster parameters of common 3D screens are also different. The grating parameters specifically refer to grating density, grating thickness, grating viewing distance, grating light transmittance, grating deviation value, and the like.
The invention has the beneficial effects that:
the embodiment of the invention provides a naked eye 3D video playing method, a server and a terminal, which are applied to a client. And exporting a picture with the same size as the screen through the client, wherein the picture carries the grating parameter information of the 3D screen. And the server performs 3D rendering processing on the video source according to the screen raster parameters carried by the picture, captures the image from the video source, corresponds each image pixel to the physical position corresponding to the picture, re-records the image to obtain a new video file, and sends the new video file to the client for playing. The video source is processed according to the grating parameters of the 3D screen, so that the effect of playing the video source by the 3D screens of different types and sizes is better, and the performance requirement of a naked eye 3D technology on the 3D screen is reduced.
Third embodiment
Fig. 3 is a schematic block diagram of a server for playing a naked eye 3D video according to an embodiment of the present invention;
the utility model provides a bore hole 3D video broadcast server, is applied to the server side, includes:
the acquiring module 105 is used for acquiring a picture which is sent by a client and carries 3D screen raster parameters, and the size of the picture corresponds to the size of a 3D screen;
the processing module 104 is configured to perform 3D rendering on a video source based on the picture, and perform re-recording in the 3D rendering process to obtain a video file;
the first communication module 103 is configured to send the video file to the client for playing.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
The processing module 104 is configured to perform 3D rendering on a video source based on a picture, and includes:
and the processing module 104 is configured to display image pixels with different parallaxes in the video source based on the position of each point pixel point in the picture.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
The processing module 104 is configured to perform 3D rendering on a video source based on a picture, and perform re-recording in a 3D rendering process to obtain a video file, and includes:
and the processing module 104 is configured to perform different image results for 3D rendering on the same video source based on different raster parameters of the picture.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
The invention has the beneficial effects that:
The embodiment of the invention provides a naked eye 3D video playing server applied to the server side. The server performs 3D rendering on the video source according to a picture exported by the client, which has the same size as the screen and carries the grating parameter information of the 3D screen: it captures images from the video source, maps each image pixel to its corresponding physical position on the picture, re-records the result into a new video file, and sends the new video file to the client for playing. Because the video source is processed according to the grating parameters of the 3D screen, 3D screens of different types and sizes play the video source with better effect, and the performance requirement of the naked eye 3D technology on the 3D screen is reduced.
Fourth embodiment
Fig. 4 is a schematic block diagram of a naked eye 3D video playing terminal according to an embodiment of the present invention;
the utility model provides a bore hole 3D video playback terminal, is applied to the client, includes:
the exporting module 106 is used for exporting the picture carrying the 3D screen raster parameters, and the size of the picture corresponds to the size of the 3D screen;
the second communication module 107 is configured to send the picture to the server, where the picture is used for the server to perform 3D rendering on the video source based on the picture, and obtain a video file after re-recording;
and the playing module 108 is configured to receive the video file sent by the server and play the video file.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
The picture carries different grating parameters according to the type and size of the 3D screen.
The invention has the beneficial effects that:
the embodiment of the invention provides a naked eye 3D video playing terminal which is applied to a client, wherein the client exports a picture with the same size as a screen, and the picture carries raster parameter information of a 3D screen. The client sends the picture to the server. And the server performs 3D rendering processing on the video source according to the screen raster parameters carried by the picture, captures the image from the video source, corresponds each image pixel to the physical position corresponding to the picture, re-records the image to obtain a new video file, and sends the new video file to the client for playing. The video source is processed according to the grating parameters of the 3D screen, so that the effect of playing the video source by the 3D screens of different types and sizes is better, and the performance requirement of a naked eye 3D technology on the 3D screen is reduced.
Fifth embodiment
The server may include a memory and a processor that may invoke a computer program in the memory. When the computer program is read and run by a processor, the server method embodiment described above may be implemented. The specific implementation and technical effects are similar, and are not described herein again.
Optionally, the present invention further provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is read and executed by a processor, the above-mentioned method embodiments can be implemented.
The client may include a memory and a processor, and the processor may invoke a computer program stored in the memory. When the computer program is read and run by the processor, the client-side method embodiment described above may be implemented. The specific implementation and technical effects are similar, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) or a processor to execute some of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and other media capable of storing program code.
It is noted that, in this document, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.

Claims (8)

1. A naked eye 3D video playing method, applied to a server side, characterized by comprising the following steps:
the server obtains a picture, sent by a client, carrying the grating parameters of a 3D screen, where the size of the picture corresponds to the size of the 3D screen; the grating parameters include: grating density, grating thickness, grating viewing distance, grating light transmittance and grating deviation value;
the server performs 3D rendering on a video source based on the picture, and performs re-recording in the 3D rendering process to obtain a video file;
the server sends the video file to a client for playing;
wherein the server performing 3D rendering on a video source based on the picture comprises: the server displays image pixels with different parallaxes in the video source based on the position of each pixel point in the picture.
2. The method according to claim 1, wherein the step of the server performing 3D rendering on a video source based on the picture comprises:
the server performs 3D rendering on the same video source according to different grating parameters of the picture, wherein the image results are different.
3. A naked eye 3D video playing method, applied to a client side, characterized by comprising the following steps:
the client exports a picture carrying the grating parameters of a 3D screen, where the size of the picture corresponds to the size of the 3D screen; the grating parameters include: grating density, grating thickness, grating viewing distance, grating light transmittance and grating deviation value;
the client sends the picture to a server, where the picture is used by the server to perform 3D rendering on a video source based on the picture and to obtain a re-recorded video file;
the client receives the video file sent by the server and plays the video file;
wherein the server performing 3D rendering on the video source based on the picture comprises: the server displays image pixels with different parallaxes in the video source based on the position of each pixel point in the picture.
4. The method according to claim 3, wherein the pictures carry different grating parameters according to the type and size of the 3D screen.
5. A naked eye 3D video playing server, applied to a server side, characterized in that the server comprises:
an acquisition module, configured to acquire a picture, sent by a client, carrying the grating parameters of a 3D screen, where the size of the picture corresponds to the size of the 3D screen; the grating parameters include: grating density, grating thickness, grating viewing distance, grating light transmittance and grating deviation value;
a processing module, configured to perform 3D rendering on a video source based on the picture and to perform re-recording in the 3D rendering process to obtain a video file; wherein the 3D rendering of the video source based on the picture comprises: the server displays image pixels with different parallaxes in the video source based on the position of each pixel point in the picture;
and a first communication module, configured to send the video file to a client for playing.
6. The server according to claim 5, wherein the processing module, configured to perform 3D rendering on a video source based on the picture and to perform re-recording in the 3D rendering process to obtain a video file, is specifically configured to:
perform 3D rendering on the same video source based on different grating parameters of the picture, wherein the image results are different.
7. A naked eye 3D video playing terminal, applied to a client side, characterized by comprising:
an exporting module, configured to export a picture carrying the grating parameters of a 3D screen, where the size of the picture corresponds to the size of the 3D screen; the grating parameters include: grating density, grating thickness, grating viewing distance, grating light transmittance and grating deviation value;
a second communication module, configured to send the picture to a server, where the picture is used by the server to perform 3D rendering on a video source based on the picture and to obtain a re-recorded video file; wherein the server performing 3D rendering on the video source based on the picture comprises: the server displays image pixels with different parallaxes in the video source based on the position of each pixel point in the picture;
and a playing module, configured to receive the video file sent by the server and play the video file.
8. The terminal according to claim 7, wherein the pictures carry different grating parameters according to the type and size of the 3D screen.
CN201811344932.2A 2018-11-13 2018-11-13 Naked eye 3D video playing method, terminal and server Active CN109194942B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811344932.2A CN109194942B (en) 2018-11-13 2018-11-13 Naked eye 3D video playing method, terminal and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811344932.2A CN109194942B (en) 2018-11-13 2018-11-13 Naked eye 3D video playing method, terminal and server

Publications (2)

Publication Number Publication Date
CN109194942A CN109194942A (en) 2019-01-11
CN109194942B true CN109194942B (en) 2020-08-11

Family

ID=64939453

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811344932.2A Active CN109194942B (en) 2018-11-13 2018-11-13 Naked eye 3D video playing method, terminal and server

Country Status (1)

Country Link
CN (1) CN109194942B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110336994B (en) * 2019-07-04 2021-06-22 上海索倍信息科技有限公司 Naked eye 3D display system
CN110913202B (en) * 2019-11-26 2022-01-07 深圳英伦科技股份有限公司 Three-dimensional display cloud rendering method and system
CN113612981A (en) * 2021-08-25 2021-11-05 福建天晴数码有限公司 Video-based 3D (three-dimensional) graph real-time rendering method and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102143166A (en) * 2011-01-25 2011-08-03 天津大学 Multi-view-point stereoscopic mobile phone video processing method based on gratings
CN102263977A (en) * 2011-08-01 2011-11-30 清华大学 Stereo video acquisition method and device for mobile terminal
CN103095828A (en) * 2013-01-14 2013-05-08 上海电力学院 Web three dimensional (3D) synchronous conference system based on rendering cloud and method of achieving synchronization
CN108139803A (en) * 2015-10-08 2018-06-08 Pcms控股公司 Method and system for automatic calibration of dynamic display configurations
CN108156436A (en) * 2017-12-27 2018-06-12 武汉微梦文化科技有限公司 The on-line processing method of three-dimensional promotional videos

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5978695B2 (en) * 2011-05-27 2016-08-24 株式会社Jvcケンウッド Autostereoscopic display device and viewpoint adjustment method
CN104320647B (en) * 2014-10-13 2017-10-10 深圳超多维光电子有限公司 Stereoscopic image generation method and display device
CN104683785B (en) * 2015-02-06 2017-02-22 四川长虹电器股份有限公司 Real-time 3D (3-Dimensional) character inserting and playing method based on naked-eye 3D technology


Also Published As

Publication number Publication date
CN109194942A (en) 2019-01-11

Similar Documents

Publication Publication Date Title
CN112219398B (en) Method and apparatus for depth coding and decoding
CN109194942B (en) Naked eye 3D video playing method, terminal and server
WO2021004176A1 (en) Image processing method and apparatus
CN110971931A (en) Video watermark adding method and device, electronic equipment and storage medium
CN108141576B (en) Display device and control method thereof
CN107465939B (en) Method and device for processing video image data stream
CN112087648B (en) Image processing method, image processing device, electronic equipment and storage medium
KR20210027482A (en) Methods and apparatus for volumetric video transmission
WO2023016035A1 (en) Video processing method and apparatus, electronic device, and storage medium
WO2021227919A1 (en) Method and device for image data encoding, display method and device, and electronic device
CN113906761A (en) Method and apparatus for encoding and rendering 3D scene using patch
KR20080018396A (en) Computer-readable medium for recording mobile application and personal computer application for displaying display information of mobile communications terminal in external display device
CN107396002B (en) A kind of processing method and mobile terminal of video image
US8599240B2 (en) Super-resolution from 3D (3D to 2D conversion) for high quality 2D playback
WO2023035973A1 (en) Video processing method and apparatus, device, and medium
CN110661880A (en) Remote assistance method, system and storage medium
WO2023016044A1 (en) Video processing method and apparatus, electronic device, and storage medium
CN114245027B (en) Video data hybrid processing method, system, electronic equipment and storage medium
CN112165631B (en) Media resource processing method and device, storage medium and electronic equipment
EP3407297B1 (en) Method and device for determining a characteristic of a display device
TW201026052A (en) Method for transmitting a man-machine operation picture, a mobile video device thereof, and a video system using the same
CN113037947B (en) Method for coding spatial information in continuous dynamic image
WO2023236162A1 (en) Camera module, image processing method and apparatus, terminal, electronic device and medium
WO2022228368A1 (en) Image processing method, device and system
WO2023036111A1 (en) Video processing method and apparatus, device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant