CN115668913A - Stereoscopic display method, device, medium and system for field performance - Google Patents

Stereoscopic display method, device, medium and system for field performance

Info

Publication number
CN115668913A
CN115668913A (application CN202280003321.XA)
Authority
CN
China
Prior art keywords
shooting
binocular video
display
target
shooting interval
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280003321.XA
Other languages
Chinese (zh)
Inventor
张建伟
杨民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Deep Vision Technology Nanjing Co ltd
Original Assignee
Deep Vision Technology Nanjing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deep Vision Technology Nanjing Co ltd filed Critical Deep Vision Technology Nanjing Co ltd
Publication of CN115668913A publication Critical patent/CN115668913A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking

Abstract

The embodiments of the present application disclose a stereoscopic display method, device, medium and system for a live performance. The method includes: performing binocular video shooting on at least two shooting intervals of an area to be shot to obtain binocular video streams of the at least two shooting intervals; storing the binocular video streams of the at least two shooting intervals, and determining a viewing position according to an eye tracking result; determining a target shooting interval corresponding to the viewing position, and displaying the binocular video stream obtained from the target shooting interval on a naked-eye 3D display device; and, in response to detecting a shooting interval switching event of the viewing position, controlling the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval.

Description

Stereoscopic display method, device, medium and system for on-site performance
The present application claims priority to Chinese patent application No. 202110670702.0, filed with the Chinese Patent Office on June 17, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiment of the application relates to the technical field of machine vision, for example, to a stereoscopic display method, device, medium and system for a live performance.
Background
Live stereoscopic display aims to provide an immersive viewing experience in which the displayed scene changes as the position of the user's head changes.
Head-mounted displays in the related art support this kind of immersive viewing: when the user looks forward, the display presents the scene captured in front of the camera position; when the user turns around, the display presents the scene captured behind the camera position. However, this approach requires the user to wear a head-mounted display, which is inconvenient for users who already wear glasses for myopia.
Disclosure of Invention
The embodiment of the application provides a stereoscopic display method, a stereoscopic display device, a stereoscopic display medium and a stereoscopic display system for a live performance.
In a first aspect, an embodiment of the present application provides a stereoscopic display method for a live performance, where the method includes:
performing binocular video shooting on at least two shooting intervals of an area to be shot to obtain binocular video streams of the at least two shooting intervals;
storing the binocular video streams of the at least two shooting intervals, and determining a viewing position according to an eye tracking result;
determining a target shooting interval corresponding to the viewing position, and displaying the binocular video stream obtained from the target shooting interval on a naked-eye three-dimensional (3D) display device; and
in response to detecting a shooting interval switching event of the viewing position, controlling the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval.
In a second aspect, an embodiment of the present application provides a stereoscopic display apparatus for a live performance, where the apparatus includes:
a video stream acquisition module, configured to perform binocular video shooting on at least two shooting intervals of an area to be shot to obtain binocular video streams of the at least two shooting intervals;
a position determining module, configured to store the binocular video streams of the at least two shooting intervals and determine a viewing position according to an eye tracking result;
a video display module, configured to determine a target shooting interval corresponding to the viewing position and display the binocular video stream obtained from the target shooting interval on a naked-eye 3D display device; and
a video switching module, configured to, in response to detecting a shooting interval switching event of the viewing position, control the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the stereoscopic display method for a live performance according to the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a stereoscopic display system for a live performance, including at least two shooting devices, a processing device, a storage device and a naked-eye 3D display device, where the at least two shooting devices, the storage device and the naked-eye 3D display device are each connected to the processing device, and where:
the at least two shooting devices are configured to perform binocular video shooting of the live performance on at least two shooting intervals;
the storage device is configured to store the binocular video streams of the at least two shooting intervals;
the naked-eye 3D display device is configured to determine a viewing position according to an eye tracking result;
the processing device is configured to determine a target shooting interval corresponding to the viewing position and control the naked-eye 3D display device to display the binocular video stream obtained from the target shooting interval; and
the processing device is further configured to, in response to detecting a shooting interval switching event of the viewing position, control the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval.
Drawings
Fig. 1 is a flowchart of a stereoscopic display method for a live performance according to an embodiment of the present disclosure;
fig. 2 is a schematic view of a shooting scene according to an embodiment of the present application;
fig. 3 is a schematic view of another shooting scene provided in the first embodiment of the present application;
fig. 4 is a schematic diagram of video acquisition and playing provided in an embodiment of the present application;
fig. 5 is a flowchart of a stereoscopic display method for a live performance according to a second embodiment of the present application;
fig. 6 is a schematic structural diagram of a stereoscopic display apparatus for a live performance according to a third embodiment of the present application;
fig. 7 is a block diagram of a stereoscopic display system for a live performance according to a fifth embodiment of the present disclosure.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures associated with the present application are shown in the drawings, not all of them.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. The process may be terminated when its operations are completed, but could have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, subprograms, and the like.
Example one
Fig. 1 is a flowchart of a stereoscopic display method for a live performance according to the first embodiment of the present application. The method enables a user to view a stereoscopic stage from various angles in the horizontal direction in front of a screen, and may be executed by the stereoscopic display apparatus for a live performance provided in the embodiments of the present application, where the apparatus may be implemented in software and/or hardware and may be integrated into a system.
As shown in fig. 1, the stereoscopic display method for a live performance includes:
s110, carrying out binocular video shooting on at least two shooting intervals of the area to be shot to obtain binocular video streams of the at least two shooting intervals.
The shooting section may connect the stage center and one of the at least one shooting device, and an area scanned by the angle range (- α, β) is a shooting section, where α is an angle at which a connection line between the stage center and one of the at least one shooting device is shifted to the left, and β is an angle at which the connection line is shifted to the right. For example, fig. 2 is a schematic diagram of a shooting scene, as shown in fig. 2, the area covered by the angle range (- α, β) connecting the center of the stage and the shooting apparatus a is the shooting range of the shooting apparatus a, that is, the range between two solid lines in fig. 2. Here, α and β may be the same or different, and this embodiment is not limited to this.
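For illustration only, the following is a minimal Python sketch of one way such an angular shooting interval could be represented and tested against a viewing angle. The class name ShootingInterval, the use of degrees, and the example camera angles are assumptions of this sketch, not details of the application.

```python
import math
from dataclasses import dataclass

@dataclass
class ShootingInterval:
    """One shooting interval: the sector swept by (-alpha, +beta) around the
    line from the stage center to its shooting device (angles in degrees)."""
    camera_angle: float  # angle of the stage-center-to-camera line
    alpha: float         # extent to the left of that line
    beta: float          # extent to the right of that line

    def contains(self, viewing_angle: float) -> bool:
        # Normalize the offset into (-180, 180] before comparing.
        offset = (viewing_angle - self.camera_angle + 180.0) % 360.0 - 180.0
        return -self.alpha <= offset <= self.beta

# Example: three cameras 30 degrees apart, each covering (-15, +15).
intervals = [ShootingInterval(a, 15.0, 15.0) for a in (-30.0, 0.0, 30.0)]
print([iv.contains(10.0) for iv in intervals])  # [False, True, False]
```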
In this embodiment, a binocular video is a video obtained by simulating the left and right eyes of a person to capture images of an object. The binocular video may be shot with binocular cameras or with monocular cameras. If monocular cameras are used, any two adjacent monocular cameras are treated as one group and are mounted in a manner similar to a binocular camera; the choice of camera type is not limited in this embodiment.
When monocular cameras are used, the line connecting the two adjacent cameras that serve as the two eyes must be perpendicular to the direction from the stage center to the midpoint of that line; under this condition the pair can be treated as a binocular camera. In addition, a better display effect is obtained if the distance between the two adjacent monocular cameras is close to the human interpupillary distance, or if the angle they subtend at the stage center is close to the angle subtended by a viewer's eyes at the screen. If the monocular cameras are placed densely enough, a binocular video stream can be obtained by binding two adjacent monocular cameras, and the pair forming a binocular stream may also skip cameras; for example, the first and third monocular cameras may form one binocular group, whose video stream is then generated and displayed in subsequent use. Whether adjacent or skipping monocular cameras are selected may be chosen manually by staff, or assigned automatically from the distances between the monocular cameras or the angles they form with the stage center, as illustrated in the sketch below.
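The automatic assignment mentioned above could, for instance, pair monocular cameras whose spacing is closest to the human interpupillary distance. The sketch below shows one assumed way to do this; the spacing constant, tolerance and function name are illustrative only.

```python
from itertools import combinations

EYE_SPACING_M = 0.065   # approximate interpupillary distance (assumption)
TOLERANCE_M = 0.02      # allowed deviation from that spacing (assumption)

def pair_monocular_cameras(positions):
    """positions: {camera_id: x coordinate in metres along the camera row}.
    Returns (left_id, right_id) pairs whose spacing is close to eye spacing;
    each camera is used in at most one pair."""
    used, pairs = set(), []
    # Try candidate pairs in order of how closely they match eye spacing.
    for a, b in sorted(combinations(positions, 2),
                       key=lambda p: abs(abs(positions[p[0]] - positions[p[1]]) - EYE_SPACING_M)):
        if a in used or b in used:
            continue
        if abs(abs(positions[a] - positions[b]) - EYE_SPACING_M) <= TOLERANCE_M:
            left, right = sorted((a, b), key=positions.get)
            pairs.append((left, right))
            used.update((a, b))
    return pairs

# Cameras 1 and 3 end up paired when camera 2 sits too close to camera 1.
print(pair_monocular_cameras({1: 0.00, 2: 0.02, 3: 0.07}))  # [(1, 3)]
```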
The binocular video stream may be continuous video data that is captured by a binocular camera or a group of monocular cameras and transmitted and played on the network in chronological order.
Illustratively, fig. 3 is a schematic view of another shooting scene. As shown in figs. 2 and 3, a series of binocular cameras are arranged in a row, or in a slightly arced line, at a suitable distance in front of the stage; the plane containing the two lenses of each binocular camera is perpendicular to the direction toward the stage center point, and the field of view (FOV) of the outermost cameras extends beyond the stage edge on their respective sides, so as to simulate spectators watching the performance from different angles within the same row of a theater. Monocular cameras can be used for the same shooting, but the mounting height of every camera must be kept consistent. The binocular cameras and monocular cameras may be of the same model, which facilitates subsequent calibration and correction.
S120: store the binocular video streams of the at least two shooting intervals, and determine the viewing position according to the eye tracking result.
The eye tracking result may be obtained by an eye tracking device that acquires the coordinates of the viewer's eyes and converts them between coordinate systems. For example, the eye tracking device may be placed at the center of the screen edge, and a spatial rectangular coordinate system A may be established with the eye tracking device as the origin. If the rectangular coordinates of the viewer's left and right eyes detected by the eye tracking device are (x1, y1, z1) and (x2, y2, z2) respectively, then the rectangular coordinates of the midpoint between the two eyes are (x3, y3, z3). In addition, another rectangular coordinate system B is established with the stage center as the origin; if the position coordinates of the viewer in front of the stage are (X1, Y1, Z1), then X1 = C1·x1, Y1 = C2·y1, Z1 = C3·z1, where C1, C2 and C3 are constants, which gives the transformation between the two coordinate systems A and B. The eye tracking device may be externally connected to the video playing device or built into it, which is not limited in this embodiment.
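A minimal sketch of this coordinate conversion follows, assuming the constants C1, C2, C3 are already known and are applied to the midpoint of the two eyes; the placeholder constant values and the function name are assumptions of the sketch.

```python
import numpy as np

# Per-axis scale constants C1, C2, C3 relating system A (eye tracker at the
# screen edge) to system B (origin at the stage center). Placeholder values.
C = np.array([1.0, 1.0, 1.0])

def viewing_position(left_eye, right_eye):
    """left_eye, right_eye: (x, y, z) in the tracker's coordinate system A.
    Returns the eye-center point in A and its mapped position in system B."""
    left = np.asarray(left_eye, dtype=float)
    right = np.asarray(right_eye, dtype=float)
    center_a = (left + right) / 2.0      # (x3, y3, z3)
    position_b = C * center_a            # (X1, Y1, Z1) = (C1*x, C2*y, C3*z)
    return center_a, position_b

center, pos = viewing_position((0.02, 0.0, 1.2), (0.085, 0.0, 1.2))
print(center, pos)
```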
In this embodiment, the video captured by the cameras is processed and then stored in a storage device or transmitted, by computer or communication technology, to back-end equipment such as a computer or mobile phone used by the viewing user. The back-end equipment receives the transmitted signal, unpacks and restores at least one video stream, and stores the video streams in a local storage device so that the back-end host can select among them during playback or later replay.
S130: determine a target shooting interval corresponding to the viewing position, and display the binocular video stream obtained from the target shooting interval on a naked-eye 3D display device.
The target shooting interval corresponding to the viewing position may be determined by first computing the angle formed by the viewing position and the naked-eye 3D display device and then finding which shooting interval that angle falls in; that shooting interval is taken as the target shooting interval. The binocular video stream corresponding to the target shooting interval is the video stream the user should see at the current viewing position, so it is displayed for the user to watch.
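Continuing the earlier ShootingInterval sketch, the target interval could be selected as follows; computing the viewing angle with atan2 over the horizontal plane is an assumption about how the angle is obtained, not the application's prescribed method.

```python
import math

def viewing_angle(position_b):
    """Horizontal angle (degrees) of the viewing position relative to the
    direction facing the screen, from stage-centered coordinates (X, Y, Z)."""
    x, _, z = position_b
    return math.degrees(math.atan2(x, z))

def target_interval(position_b, intervals):
    """Return the shooting interval whose angle range contains the viewer's
    angle (see the ShootingInterval sketch above), or None if there is none."""
    angle = viewing_angle(position_b)
    return next((iv for iv in intervals if iv.contains(angle)), None)
```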
For example, fig. 4 is a schematic diagram of video capture and playback. As shown in fig. 4, the eye tracking device converts the detected eye coordinates, determines the viewing position, and sends the viewing position information to the back-end host system. The back-end host system matches the viewing position information against the shooting intervals, determines which shooting interval the viewer's position falls in, takes the matched shooting interval as the target shooting interval, retrieves the binocular video corresponding to the target shooting interval from the storage device, and controls the naked-eye 3D display device to play it for the viewer. The naked-eye 3D display device may be a display device that uses the parallax between a person's two eyes to produce a realistic stereoscopic image with a sense of space and depth without any auxiliary device (such as 3D glasses or a helmet).
S140: in response to detecting a shooting interval switching event of the viewing position, control the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval.
A shooting interval switching event of the viewing position occurs when the viewer's viewing position moves from one shooting interval to another. Illustratively, when the viewing position changes, the eye tracking device detects the change and sends the new eye coordinates to the back-end host system; the back-end host system determines that the viewing position has moved from one shooting interval to another, retrieves the binocular video stream corresponding to the new shooting interval from the storage device, and controls the naked-eye 3D display device to display it.
According to the technical solution provided in this embodiment of the application, binocular video shooting is performed on at least two shooting intervals of an area to be shot to obtain binocular video streams of the at least two shooting intervals; the binocular video streams of the at least two shooting intervals are stored, and the viewing position is determined according to the eye tracking result; a target shooting interval corresponding to the viewing position is determined, and the binocular video stream obtained from the target shooting interval is displayed on a naked-eye 3D display device; and, in response to detecting a shooting interval switching event of the viewing position, the naked-eye 3D display device is controlled to display the binocular video stream obtained from the switched shooting interval. By these means, this embodiment achieves stereoscopic broadcasting of the stage from different angles, so that the audience can watch a lifelike live performance video in real time through a naked-eye 3D display and can choose the angle from which to watch the live performance according to their own preference.
This solution also simplifies the equipment needed for watching stereoscopic video and improves the user's viewing comfort.
In one example, the stereoscopic display method for a live performance described in the embodiments of the present application is performed by a stereoscopic display system for a live performance.
Example two
Fig. 5 is a flowchart of a stereoscopic display method for a live performance according to the second embodiment of the present application, which is optimized on the basis of the above embodiment. The optimization is as follows: after determining the target shooting interval corresponding to the viewing position and displaying the binocular video stream obtained from the target shooting interval on the naked-eye 3D display device, the method further includes: determining the shooting intervals adjacent to the target shooting interval corresponding to the viewing position; and, while displaying the binocular video stream obtained from the target shooting interval, synchronously preloading the binocular video streams obtained from the adjacent shooting intervals. Controlling the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval in response to detecting a shooting interval switching event of the viewing position includes: in response to detecting the shooting interval switching event of the viewing position, determining a target adjacent shooting interval; and synchronously switching, through the naked-eye 3D display device, to the preloaded binocular video stream obtained from the target adjacent shooting interval. With this processing, the binocular video stream can be switched smoothly when the viewer changes position.
As shown in fig. 5, the method of the present embodiment includes the following steps:
and S510, carrying out binocular video shooting on at least two shooting intervals of the area to be shot to obtain binocular video streams of the at least two shooting intervals.
In this embodiment, in an example, before performing binocular video shooting through at least two shooting intervals of a region to be shot, obtaining calibration parameters of the at least two shooting devices is further included, and correspondingly, after obtaining binocular video streams of the at least two shooting intervals, performing calibration parameter compensation processing on binocular video streams obtained by shooting through the at least two shooting devices based on the calibration parameters is further included.
The calibration parameters may be parameters that need to be calibrated in the camera. For example, the calibration parameter may be a vertical parallax existing in the shooting device, where the vertical parallax may be caused by a slight deviation of the placement heights of two cameras of the shooting device in the current shooting interval, so that the two eyes of the shooting device may generate parallax in the vertical direction. The calibration parameter compensation process may be to calibrate the calibration parameters and eliminate errors.
Illustratively, the left image and the right image in the binocular video stream captured by the at least two capturing devices are vertically calibrated based on the vertical parallax; wherein the vertical calibration is to vertically shift the left image and the right image based on the vertical parallax so that the vertical parallax of the left image and the right image is eliminated. The embodiment can process the distortion generated by the image shot by the shooting equipment by acquiring the calibration parameters of the shooting equipment and performing compensation processing on the calibration parameters, thereby obtaining a better image.
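A sketch of the vertical calibration step using OpenCV: here only the right frame is shifted and the left frame is taken as the reference, which is one possible simplification of shifting both images, and the parallax value is assumed to come from a prior calibration rather than being measured here.

```python
import cv2
import numpy as np

def compensate_vertical_parallax(left, right, vertical_parallax_px):
    """Shift the right frame vertically by vertical_parallax_px pixels
    (positive = right camera mounted lower) so both frames are row-aligned."""
    h, w = right.shape[:2]
    # Pure vertical translation; the left frame is kept as the reference view.
    m = np.float32([[1, 0, 0],
                    [0, 1, -vertical_parallax_px]])
    right_aligned = cv2.warpAffine(right, m, (w, h))
    return left, right_aligned
```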
In another example of this embodiment, before the binocular video streams of the at least two shooting intervals are obtained, shooting parameters of the at least two shooting devices are obtained; correspondingly, after the binocular video streams of the at least two shooting intervals are obtained, shooting parameter consistency processing is performed, based on the shooting parameters, on the binocular video streams shot by the at least two shooting devices.
The shooting parameters of a shooting device may include, for example, focal length, depth of field and exposure.
In this embodiment, the shooting parameters of the binocular videos obtained in different shooting intervals are adjusted to be consistent, which improves the audience's viewing experience.
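As a rough illustration of the exposure side of such consistency processing only, frames from different intervals could be gain-matched to a reference frame. This mean-luminance matching is a crude stand-in of this sketch, not the application's actual processing, and it does not touch focal length or depth of field.

```python
import numpy as np

def match_mean_luminance(reference_frame, frame):
    """Scale `frame` so its mean luminance matches `reference_frame`'s.
    Both are uint8 images; a simple stand-in for exposure consistency."""
    ref_mean = float(reference_frame.mean())
    cur_mean = float(frame.mean()) or 1.0   # avoid division by zero
    gain = ref_mean / cur_mean
    return np.clip(frame.astype(np.float32) * gain, 0, 255).astype(np.uint8)
```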
S520: store the binocular video streams of the at least two shooting intervals, and determine the viewing position according to the eye tracking result.
S530: determine a target shooting interval corresponding to the viewing position, and display the binocular video stream obtained from the target shooting interval on the naked-eye 3D display device.
S540: determine the shooting intervals adjacent to the target shooting interval corresponding to the viewing position.
The target shooting interval may have one, two or more adjacent shooting intervals; the number is not limited in this embodiment. For example, whether a shooting interval is adjacent to the target shooting interval may be determined by comparing the straight-line distances between the other shooting intervals and the target shooting interval, as in the sketch below; the number of adjacent shooting intervals may be fixed or adjusted dynamically for different scenes.
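One assumed way to pick adjacent intervals by straight-line distance between the intervals' camera positions, with a configurable count; the data layout and function name are illustrative only.

```python
def adjacent_intervals(target_id, camera_positions, count=2):
    """camera_positions: {interval_id: (x, y)} of each interval's camera.
    Returns the `count` interval ids closest to the target interval."""
    tx, ty = camera_positions[target_id]
    others = [i for i in camera_positions if i != target_id]
    # Sort by squared straight-line distance to the target interval's camera.
    others.sort(key=lambda i: (camera_positions[i][0] - tx) ** 2 +
                              (camera_positions[i][1] - ty) ** 2)
    return others[:count]

print(adjacent_intervals('A', {'A': (0, 0), 'B': (1, 0),
                               'C': (-1, 0), 'D': (3, 0)}))  # ['B', 'C']
```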
S550: while displaying the binocular video stream obtained from the target shooting interval, synchronously preload the binocular video streams obtained from the adjacent shooting intervals.
Synchronous preloading means downloading the binocular video streams corresponding to the adjacent shooting intervals at the same time as the binocular video stream corresponding to the target shooting interval is being displayed.
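A sketch of synchronous preloading with background threads: fetch_stream is a hypothetical loader that downloads the binocular stream of a given interval, and the cache layout is an assumption of this sketch.

```python
import threading

class StreamCache:
    """Keeps the streams of adjacent intervals preloaded so that a switch
    does not have to wait on the network."""

    def __init__(self, fetch_stream):
        self._fetch = fetch_stream      # hypothetical loader: interval id -> stream
        self._cache = {}
        self._lock = threading.Lock()

    def preload(self, interval_ids):
        # Download each adjacent interval's stream on its own background thread.
        for interval_id in interval_ids:
            threading.Thread(target=self._load, args=(interval_id,),
                             daemon=True).start()

    def _load(self, interval_id):
        stream = self._fetch(interval_id)
        with self._lock:
            self._cache[interval_id] = stream

    def get(self, interval_id):
        with self._lock:
            if interval_id in self._cache:
                return self._cache[interval_id]
        # Fall back to a blocking fetch if the stream was not preloaded yet.
        return self._fetch(interval_id)
```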
S560: in response to detecting a shooting interval switching event of the viewing position, determine a target adjacent shooting interval.
The target adjacent shooting interval is at least one adjacent shooting interval of the target shooting interval that corresponds to the new viewing position after the switch.
It can be understood that when the viewer's viewing position changes, the back-end host system determines a new target shooting interval according to the new viewing position and then determines the target adjacent shooting intervals of that new target shooting interval.
S570: synchronously switch, through the naked-eye 3D display device, to the preloaded binocular video stream obtained from the target adjacent shooting interval.
Illustratively, while the display device is playing the binocular video corresponding to shooting interval A, the back-end host system downloads the binocular video streams corresponding to the adjacent shooting intervals B and C of shooting interval A. When the viewer's viewing position moves into shooting interval B, the back-end host system controls the display device to switch to the binocular video corresponding to shooting interval B and at the same time loads the binocular video streams corresponding to the target adjacent shooting intervals E and F of shooting interval B.
In this embodiment, synchronously switching, through the naked-eye 3D display device, to the preloaded binocular video stream obtained from the target adjacent shooting interval includes: acquiring the frame blanking period of the naked-eye 3D display device; and, in response to detecting the shooting interval switching event of the viewing position, switching from the current binocular video stream to the binocular video stream obtained from the target adjacent shooting interval during the frame blanking period. Within each frame the display is bright for a period of time, corresponding to the active display period, and then fully dark for a brief period, the blanking period, which is too short to be perceived by the human eye. If the video were switched while the frame is being actively displayed, the picture would flicker. In this embodiment the back-end host system switches the video during the frame blanking period, that is, while the screen is dark, so the viewer does not perceive the switch, which improves the viewing experience.
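Conceptually, the switch can be deferred until the next blanking period, as in the sketch below; wait_for_vblank and display.set_stream are hypothetical hooks, since the real interface depends on the naked-eye 3D display device and its driver.

```python
def switch_during_blanking(display, new_stream, wait_for_vblank):
    """Swap to the preloaded binocular stream while the panel is dark.

    display         : object exposing set_stream(stream) (hypothetical)
    new_stream      : preloaded stream of the target adjacent shooting interval
    wait_for_vblank : hypothetical callable that blocks until the next frame
                      blanking period begins
    """
    wait_for_vblank()               # the screen is fully dark during blanking,
    display.set_stream(new_stream)  # so the viewer cannot see the switch happen
```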
According to the technical solution provided in this embodiment of the application, the shooting intervals adjacent to the target shooting interval corresponding to the viewing position are determined; while the binocular video stream obtained from the target shooting interval is displayed, the binocular video streams obtained from the adjacent shooting intervals are synchronously preloaded; and, correspondingly, controlling the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval in response to detecting a shooting interval switching event of the viewing position includes: in response to detecting the shooting interval switching event of the viewing position, determining the target adjacent shooting interval; and synchronously switching, through the naked-eye 3D display device, to the preloaded binocular video stream obtained from the target adjacent shooting interval. By these means, this embodiment switches videos synchronously, avoids stalls and delays during video stream switching, and allows the audience to smoothly watch the area to be shot in three dimensions from different angles. In one example, because the viewing angle of the screen is limited, even if the user moves left and right to the maximum extent, the range of content switching may not cover all the angles captured by the camera array; for example, the camera array may capture 360 degrees while eye tracking combined with the screen can only switch among videos captured within a range of at most 120 degrees. In the technical solution of the present application, the reference video angle corresponding to the direction directly facing the screen can be changed by manual control. For example, under normal conditions the picture seen by a user facing the screen is the one shot by the shooting device facing the stage center; if the user wants to see a wider range, the picture shot by a shooting device offset from the stage center by a certain angle can be used as the picture seen when facing the screen, and the user can then move left and right to watch nearby angles.
EXAMPLE III
Fig. 6 is a block diagram of a stereoscopic display apparatus for a live performance according to the third embodiment of the present application. The apparatus can execute the stereoscopic display method for a live performance provided in any embodiment of the present application and has the functional modules for executing the method and the corresponding beneficial effects. As shown in fig. 6, the apparatus may include:
the video stream acquisition module 610 is configured to perform binocular video shooting on at least two shooting intervals of the area to be shot to obtain binocular video streams of the at least two shooting intervals;
a position determining module 620 configured to store the binocular video streams of the at least two photographing sections and determine a viewing position according to a result of eye tracking;
the video display module 630 is configured to determine a target shooting interval corresponding to the viewing position, and display a binocular video stream obtained based on the target shooting interval in naked-eye 3D display equipment;
and the video switching module 640 is configured to respond to a shooting interval switching event of the detected watching position and control the naked eye 3D display device to display a binocular video stream obtained based on the switched shooting interval.
In one embodiment, the apparatus further comprises:
the adjacent interval determining module is arranged to determine an adjacent shooting interval of the target shooting interval corresponding to the watching position;
the video loading module is used for synchronously preloading binocular video streams obtained based on the adjacent shooting intervals in the process of displaying the binocular video streams obtained based on the target shooting intervals;
accordingly, the video switching module 640 includes:
a target adjacent section determining unit configured to determine a target adjacent shooting section in response to detection of a shooting section switching event of the viewing position;
and the video switching unit is arranged for synchronously switching the pre-loaded binocular video stream obtained based on the target adjacent shooting interval through the naked eye 3D display equipment.
In an embodiment, the video switching unit is further configured to: acquiring a frame blanking period of the naked eye 3D display device; and in response to detecting the shooting interval switching event of the watching position, switching from the current binocular video stream to a binocular video stream obtained based on the target adjacent shooting interval in the frame blanking period.
In one embodiment, the apparatus further comprises:
the parameter acquisition module is arranged for acquiring calibration parameters of the at least two shooting devices;
correspondingly, after obtaining the binocular video stream of at least two shooting intervals, the device further comprises:
and the parameter compensation module is used for carrying out calibration parameter compensation processing on the binocular video stream obtained by shooting of the at least two shooting devices based on the calibration parameters.
In one embodiment, the apparatus further comprises:
the shooting parameter acquisition module is used for acquiring the shooting parameters of the at least two shooting devices;
and the parameter processing module is set to carry out shooting parameter consistency processing on the binocular video stream obtained by shooting of the at least two shooting devices based on the shooting parameters.
The apparatus can execute the stereoscopic display method for a live performance provided in the embodiments of the present application and has the functional modules for executing the method and the corresponding beneficial effects.
Example four
A fourth embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the stereoscopic display method for live performance as provided in all embodiments of the present application:
performing binocular video shooting on at least two shooting intervals of an area to be shot to obtain binocular video streams of the at least two shooting intervals;
storing the binocular video streams of the at least two shooting intervals, and determining a viewing position according to an eye tracking result;
determining a target shooting interval corresponding to the viewing position, and displaying the binocular video stream obtained from the target shooting interval on a naked-eye 3D display device; and
in response to detecting a shooting interval switching event of the viewing position, controlling the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval.
Any combination of one or more computer-readable media may be employed. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF (Radio Frequency), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
EXAMPLE five
Fig. 7 is a block diagram of a stereoscopic display system for a live performance according to the fifth embodiment of the present application. The system can execute the stereoscopic display method for a live performance provided in any embodiment of the present application and has the functional modules for executing the method and the corresponding beneficial effects. As shown in fig. 7, the system may include:
at least two photographing devices 710, a processing device 720, a storage device 730, and a naked eye 3D display device 740; the at least two photographing devices 710, the storage device 730, and the naked-eye 3D display device 740 are all connected to the processing device 720, wherein:
the at least two shooting devices 710 are configured to perform binocular video shooting of live performance on at least two shooting intervals;
the storage device 730 is configured to store the binocular video streams of the at least two shooting intervals;
the naked eye 3D display device 740 is configured to determine a viewing position according to a human eye tracking result;
the processing device 720 is configured to determine a target shooting interval corresponding to the viewing position, and control the naked eye 3D display device to display a binocular video stream obtained based on the target shooting interval;
the processing device 720 is further configured to control the naked eye 3D display device to display a binocular video stream obtained based on the switched shooting interval in response to detecting a shooting interval switching event of the viewing position.
The system can execute the stereoscopic display method for a live performance provided in the embodiments of the present application and has the functional modules for executing the method and the corresponding beneficial effects.
The embodiments of the present application provide a stereoscopic display method, device, medium and system for a live performance, which enable a user to watch stereoscopic video from any angle without wearing a head-mounted display and improve the user's viewing comfort.
Note that the above are only some embodiments of the present application. Those skilled in the art will appreciate that the present application is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions can be made without departing from the scope of the present application. Therefore, although the present application has been described in some detail with reference to the above embodiments, it is not limited to them and may include other equivalent embodiments without departing from its concept; its scope is determined by the scope of the appended claims.

Claims (10)

  1. A method of stereoscopic display of a live performance, comprising:
    performing binocular video shooting on at least two shooting intervals of an area to be shot to obtain binocular video streams of the at least two shooting intervals;
    storing the binocular video streams of the at least two shooting intervals, and determining a viewing position according to an eye tracking result;
    determining a target shooting interval corresponding to the viewing position, and displaying the binocular video stream obtained from the target shooting interval on a naked-eye three-dimensional (3D) display device; and
    in response to detecting a shooting interval switching event of the viewing position, controlling the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval.
  2. The method of claim 1, wherein after determining the target shooting interval corresponding to the viewing position and displaying the binocular video stream obtained from the target shooting interval on the naked-eye 3D display device, the method further comprises:
    determining shooting intervals adjacent to the target shooting interval corresponding to the viewing position; and
    synchronously preloading the binocular video streams obtained from the adjacent shooting intervals while displaying the binocular video stream obtained from the target shooting interval;
    wherein controlling the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval in response to detecting the shooting interval switching event of the viewing position comprises:
    in response to detecting the shooting interval switching event of the viewing position, determining a target adjacent shooting interval; and
    synchronously switching, through the naked-eye 3D display device, to the preloaded binocular video stream obtained from the target adjacent shooting interval.
  3. The method of claim 2, wherein synchronously switching, through the naked-eye 3D display device, to the preloaded binocular video stream obtained from the target adjacent shooting interval comprises:
    acquiring a frame blanking period of the naked-eye 3D display device; and
    in response to detecting the shooting interval switching event of the viewing position, switching from the current binocular video stream to the binocular video stream obtained from the target adjacent shooting interval during the frame blanking period.
  4. The method of claim 1, wherein before binocular video shooting is performed on the at least two shooting intervals of the area to be shot, the method further comprises:
    obtaining calibration parameters of at least two shooting devices;
    and after binocular video shooting is performed on the at least two shooting intervals of the area to be shot to obtain the binocular video streams of the at least two shooting intervals, the method further comprises:
    performing calibration parameter compensation processing, based on the calibration parameters, on the binocular video streams shot by the at least two shooting devices.
  5. The method of claim 4, wherein, in response to the calibration parameters comprising a vertical parallax existing between the at least two shooting devices, performing calibration parameter compensation processing on the binocular video streams shot by the at least two shooting devices based on the calibration parameters comprises:
    vertically calibrating, based on the magnitude of the vertical parallax, a left image and a right image in the binocular video streams shot by the at least two shooting devices, wherein the vertical calibration vertically shifts the left image and the right image based on the vertical parallax so that the vertical parallax between the left image and the right image is eliminated.
  6. The method of claim 1, wherein before the binocular video streams of the at least two shooting intervals are obtained by binocular video shooting of the at least two shooting intervals of the area to be shot, the method further comprises:
    obtaining shooting parameters of at least two shooting devices;
    and after binocular video shooting is performed on the at least two shooting intervals of the area to be shot to obtain the binocular video streams of the at least two shooting intervals, the method further comprises:
    performing shooting parameter consistency processing, based on the shooting parameters, on the binocular video streams shot by the at least two shooting devices.
  7. A stereoscopic display apparatus for a live performance, comprising:
    a video stream acquisition module, configured to perform binocular video shooting on at least two shooting intervals of an area to be shot to obtain binocular video streams of the at least two shooting intervals;
    a position determining module, configured to store the binocular video streams of the at least two shooting intervals and determine a viewing position according to an eye tracking result;
    a video display module, configured to determine a target shooting interval corresponding to the viewing position and display the binocular video stream obtained from the target shooting interval on a naked-eye 3D display device; and
    a video switching module, configured to, in response to detecting a shooting interval switching event of the viewing position, control the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval.
  8. The apparatus of claim 7, further comprising:
    an adjacent interval determining module, configured to determine shooting intervals adjacent to the target shooting interval corresponding to the viewing position; and
    a video loading module, configured to synchronously preload the binocular video streams obtained from the adjacent shooting intervals while the binocular video stream obtained from the target shooting interval is being displayed;
    wherein the video switching module comprises:
    a target adjacent interval determining unit, configured to determine a target adjacent shooting interval in response to detecting a shooting interval switching event of the viewing position; and
    a video switching unit, configured to synchronously switch, through the naked-eye 3D display device, to the preloaded binocular video stream obtained from the target adjacent shooting interval.
  9. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of stereoscopic display of a live performance according to any of claims 1-6.
  10. A stereoscopic display system for a live performance, comprising at least two shooting devices, a processing device, a storage device and a naked-eye 3D display device, wherein the at least two shooting devices, the storage device and the naked-eye 3D display device are each connected to the processing device, and wherein:
    the at least two shooting devices are configured to perform binocular video shooting of the live performance on at least two shooting intervals;
    the storage device is configured to store the binocular video streams of the at least two shooting intervals;
    the naked-eye 3D display device is configured to determine a viewing position according to an eye tracking result;
    the processing device is configured to determine a target shooting interval corresponding to the viewing position and control the naked-eye 3D display device to display the binocular video stream obtained from the target shooting interval; and
    the processing device is further configured to, in response to detecting a shooting interval switching event of the viewing position, control the naked-eye 3D display device to display the binocular video stream obtained from the switched shooting interval.
CN202280003321.XA 2021-06-17 2022-06-17 Stereoscopic display method, device, medium and system for field performance Pending CN115668913A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202110670702.0A CN113411561A (en) 2021-06-17 2021-06-17 Stereoscopic display method, device, medium and system for field performance
CN2021106707020 2021-06-17
PCT/CN2022/099369 WO2022262839A1 (en) 2021-06-17 2022-06-17 Stereoscopic display method and apparatus for live performance, medium, and system

Publications (1)

Publication Number Publication Date
CN115668913A true CN115668913A (en) 2023-01-31

Family

ID=77684718

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110670702.0A Withdrawn CN113411561A (en) 2021-06-17 2021-06-17 Stereoscopic display method, device, medium and system for field performance
CN202280003321.XA Pending CN115668913A (en) 2021-06-17 2022-06-17 Stereoscopic display method, device, medium and system for field performance

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110670702.0A Withdrawn CN113411561A (en) 2021-06-17 2021-06-17 Stereoscopic display method, device, medium and system for field performance

Country Status (2)

Country Link
CN (2) CN113411561A (en)
WO (1) WO2022262839A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113411561A (en) * 2021-06-17 2021-09-17 纵深视觉科技(南京)有限责任公司 Stereoscopic display method, device, medium and system for field performance
CN114422819A (en) * 2022-01-25 2022-04-29 纵深视觉科技(南京)有限责任公司 Video display method, device, equipment, system and medium
CN114979732B (en) * 2022-05-12 2023-10-20 咪咕数字传媒有限公司 Video stream pushing method and device, electronic equipment and medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101651841B (en) * 2008-08-13 2011-12-07 华为技术有限公司 Method, system and equipment for realizing stereo video communication
CN102497570A (en) * 2011-12-23 2012-06-13 天马微电子股份有限公司 Tracking-type stereo display device and display method thereof
CN104349155B (en) * 2014-11-25 2017-02-01 深圳超多维光电子有限公司 Method and equipment for displaying simulated three-dimensional image
CN107454381A (en) * 2017-06-22 2017-12-08 上海玮舟微电子科技有限公司 A kind of bore hole 3D display method and device
CN113411561A (en) * 2021-06-17 2021-09-17 纵深视觉科技(南京)有限责任公司 Stereoscopic display method, device, medium and system for field performance

Also Published As

Publication number Publication date
WO2022262839A1 (en) 2022-12-22
CN113411561A (en) 2021-09-17

Similar Documents

Publication Publication Date Title
US10750154B2 (en) Immersive stereoscopic video acquisition, encoding and virtual reality playback methods and apparatus
US9965026B2 (en) Interactive video display method, device, and system
US9167289B2 (en) Perspective display systems and methods
US9774896B2 (en) Network synchronized camera settings
EP3189657B1 (en) Method and apparatus for transmitting and/or playing back stereoscopic content
US10438633B2 (en) Method and system for low cost television production
CN115668913A (en) Stereoscopic display method, device, medium and system for field performance
US20150358539A1 (en) Mobile Virtual Reality Camera, Method, And System
CN108154058B (en) Graphic code display and position area determination method and device
US10681276B2 (en) Virtual reality video processing to compensate for movement of a camera during capture
US20100013738A1 (en) Image capture and display configuration
WO2003081921A1 (en) 3-dimensional image processing method and device
CA2933704A1 (en) Systems and methods for producing panoramic and stereoscopic videos
CN102970569A (en) Viewing area adjusting device, video processing device, and viewing area adjusting method
US10404964B2 (en) Method for processing media content and technical equipment for the same
CN109799899B (en) Interaction control method and device, storage medium and computer equipment
CN110730340B (en) Virtual audience display method, system and storage medium based on lens transformation
WO2017141584A1 (en) Information processing apparatus, information processing system, information processing method, and program
CN108563410B (en) Display control method and electronic equipment
CN114449303A (en) Live broadcast picture generation method and device, storage medium and electronic device
US20190335153A1 (en) Method for multi-camera device
US9258547B2 (en) Intelligent device with both recording and playing back 3D movies and the relevant apparatus and methods
US20150237335A1 (en) Three-Dimensional Television Calibration
CN108513122B (en) Model adjusting method and model generating device based on 3D imaging technology
EP2852149A1 (en) Method and apparatus for generation, processing and delivery of 3D video

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination