CN115996286A - Display control device and method, storage medium and electronic equipment - Google Patents

Display control device and method, storage medium and electronic equipment

Info

Publication number
CN115996286A
CN115996286A (application number CN202211599467.3A)
Authority
CN
China
Prior art keywords
video source
current video
type
interleaving
display control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211599467.3A
Other languages
Chinese (zh)
Inventor
李培军
李志兴
高�豪
马瑞宇
薛海林
彭晓青
秦伟达
商世明
李艳云
朱劲野
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Display Technology Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202211599467.3A priority Critical patent/CN115996286A/en
Publication of CN115996286A publication Critical patent/CN115996286A/en
Pending legal-status Critical Current

Landscapes

  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The disclosure relates to the technical field of 3D display, and in particular relates to a display control device and method, a storage medium and an electronic device. The device comprises: the video type identification module is used for identifying the video type of the current video source; the viewpoint type identification module is used for determining the viewpoint type corresponding to the current video source when the current video source is determined to be a 3D video source; and the image interleaving processing module is used for determining a corresponding interleaving operation strategy according to the viewpoint type, and locally interleaving the current video source based on the interleaving operation strategy so as to generate and display a corresponding naked eye 3D video. With this scheme, captured images of the screen picture can be subjected to 3D multi-view interleaving locally and multi-view naked eye 3D images are output directly; these can be played with an ordinary player, ensuring no delay and high reliability.

Description

Display control device and method, storage medium and electronic equipment
Technical Field
The disclosure relates to the technical field of 3D display, and in particular, to a display control device, a display control method, a storage medium, and an electronic apparatus.
Background
Naked eye 3D generally refers to a display technology that realizes a stereoscopic effect without external tools such as polarized glasses. Users typically need a dedicated player when viewing 3D video; however, a dedicated player suffers from delay caused by transmission of the video stream during video processing and playing. Conventional common players, by contrast, can play neither 3D video nor naked eye 3D video.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides a display control apparatus, a display control method, a storage medium, and an electronic device capable of solving problems existing in the prior art to some extent.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a display control apparatus including:
the video type identification module is used for identifying the video type of the current video source;
the viewpoint type identification module is used for determining the viewpoint type corresponding to the current video source when the current video source is determined to be a 3D video source;
and the image interleaving processing module is used for determining a corresponding interleaving operation strategy according to the viewpoint type, and locally interleaving the current video source based on the interleaving operation strategy so as to generate and display a corresponding naked eye 3D video.
In some exemplary embodiments, the video type identification module includes:
the file identification module is used for reading the file identification information of the current video source so as to determine the video type of the current video source; the file identification information of the current video source is configured according to a preset rule;
and the input information identification module is used for determining the video type of the current video source according to the input video type information corresponding to the current video source.
In some exemplary embodiments, the apparatus further comprises:
and the 3D playing mode executing module is used for executing a 3D playing mode when the current video source is determined to be a 3D video source, so as to determine the viewpoint type of the 3D video source.
In some exemplary embodiments, the viewpoint type recognition module includes:
and the image recognition execution module is used for acquiring the target image frame corresponding to the current video source, and carrying out image recognition on the target image frame so as to determine the viewpoint type of the current video source according to the image recognition result.
In some exemplary embodiments, the image interleaving processing module includes:
and the interleaving service calling module is used for calling the 3D interleaving service, carrying out image capturing on the current video source frame by frame, and locally carrying out interleaving processing on the image frames corresponding to the current video source frame by frame according to the interleaving operation strategy so as to generate the multi-view naked eye 3D video.
In some exemplary embodiments, the apparatus further comprises:
the display interface control module is used for displaying the current video source in the first display interface and collecting corresponding image frames; and displaying the naked eye 3D video in a second display interface.
In some exemplary embodiments, the viewpoint types corresponding to the 3D video source include: any one of a left-right view format, an up-down view format, an eight view format, and a nine view format.
In some exemplary embodiments, the apparatus further comprises:
and the 2D playing mode executing module is used for executing a 2D playing mode when the current video source is determined to be the 2D video source so as to directly play the current video source.
According to a second aspect of the present disclosure, there is provided a display control method, the method including:
identifying the video type of the current video source;
when the current video source is determined to be a 3D video source, determining a viewpoint type corresponding to the current video source;
and determining a corresponding interleaving operation strategy according to the viewpoint type, and locally interleaving the current video source based on the interleaving operation strategy to generate and display a corresponding naked eye 3D video.
According to a third aspect of the present disclosure, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the display control method described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to implement the display control method described above via execution of the executable instructions.
According to the display control device and the display control method provided by the embodiments of the disclosure, the current video source to be played is judged to be a 2D video or a 3D video by identifying its video type; when the current video source is determined to be a 3D video, the viewpoint type can be identified, the corresponding interleaving operation strategy can be determined, and interleaving processing can be carried out on the current video locally, so that the corresponding naked eye 3D video is generated and can be played by using a local common player.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
Fig. 1 schematically illustrates a composition diagram of a display control apparatus in an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates a schematic view of a video source image screen capture of an exemplary embodiment of the present disclosure;
fig. 3 schematically illustrates a schematic diagram of a nine-viewpoint type RGB pixel-array diagram according to an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a schematic diagram of an image input after an interleaving operation in an exemplary embodiment of the present disclosure;
fig. 5 schematically illustrates a flowchart of a control method corresponding to a display control device in an exemplary embodiment of the disclosure;
FIG. 6 schematically illustrates a schematic diagram of a display control method in an exemplary embodiment of the present disclosure;
fig. 7 schematically illustrates a schematic diagram of another display control method in an exemplary embodiment of the present disclosure;
fig. 8 schematically illustrates a composition diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
In order to overcome the disadvantages and shortcomings of the prior art, the present exemplary embodiment provides a display control device, which can be applied to display control of a general 2D video and a naked eye 3D video. Referring to fig. 1, the display control apparatus described above may include: a video type identification module 101, a viewpoint type identification module 102, and an image interleaving processing module 103. Wherein:
the video type identification module 101 may be configured to identify a video type of a current video source.
The viewpoint type identifying module 102 may be configured to determine, when determining that the current video source is a 3D video source, a viewpoint type corresponding to the current video source.
The image interleaving processing module 103 may be configured to determine a corresponding interleaving operation policy according to the viewpoint type, and locally perform interleaving processing on the current video source based on the interleaving operation policy, so as to generate and display a corresponding naked eye 3D video.
The display control device provided in this example embodiment may identify the video type of the current video source to be played by using the video type identifying module. When the current video source is determined to be a 3D video, the corresponding viewpoint type can be determined by using the viewpoint type identification module, and the image interleaving processing module can determine the corresponding interleaving operation strategy according to the viewpoint type; the current video may then be locally interleaved to generate a corresponding naked eye 3D video. After the naked eye 3D video is generated, it can be played by using a local common player, so that the purpose of playing the naked eye 3D video by using the common player is realized. Moreover, because the 3D video is processed locally at the terminal equipment, delay is avoided.
Next, each constituent element of the display control apparatus in the present exemplary embodiment will be described in more detail with reference to the drawings and examples.
In this example embodiment, in the video type identification module 101, it may be used to identify the video type of the current video source.
Specifically, the current video source may be a video file to be played, which is selected by a user on the terminal device. For example, the terminal device may be an intelligent terminal device such as a mobile phone, a tablet computer, a computer, etc. The video file to be played selected by the user can be a video file locally stored in the terminal device, or can be a video link selected to be played in a webpage. When a user selects a video file in an interactive interface of the terminal equipment, a video type identification module can be triggered to identify the type of a current video source first, and the current video source is judged to be a common 2D video or a 3D video.
In some example embodiments, the video type identification module may include: a file identification module and/or an input information identification module.
The file identification module may be configured to read file identification information of the current video source to determine a video type of the current video source; the file identification information of the current video source is configured according to a preset rule.
Specifically, the file identification information may be a file name. For video files or videos in webpages, corresponding file names can be configured in advance according to rules, with a 3D identifier added in the file name. For example, for the 3D video "torrent", its file name may be configured as "torrent-3D.mp4" or "torrent(3D).mp4"; for a video webpage, the file name of the video can be extracted from the webpage, and the identification and judgment can be performed. Conversely, if the file name of the read video file does not contain the 3D identifier, it may be determined that the video is a video in a normal format and is not a 3D video.
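As a minimal sketch of this naming rule (the specific "3D" markers checked here are illustrative assumptions, not an exhaustive preset rule), the file identification step might look as follows:

```python
import os

def is_3d_video_source(file_name: str) -> bool:
    """Return True if the file name carries a 3D identifier per the preset naming rule."""
    stem = os.path.splitext(os.path.basename(file_name))[0].lower()
    # The markers below are example identifiers assumed for illustration.
    return any(marker in stem for marker in ("-3d", "(3d)", "_3d"))

# e.g. is_3d_video_source("torrent-3D.mp4") -> True
#      is_3d_video_source("video.mp4")      -> False
```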
In some exemplary embodiments, the input information identifying module may be configured to determine a video type of the current video source according to the input video type information corresponding to the current video source.
Specifically, after the user selects the video file in the interactive interface of the terminal device, an interactive interface may be provided for the user to input the video type information of the current video source. For example, when the user clicks and opens a video file with a file name of "video.mp4" in the interactive interface, the player may first display an interactive interface for confirming the video type, and the user may input or select whether the video type of the current video source is 2D video or 3D video. Alternatively, the provided interactive interface can be used for the user to select the video source to be played before the video is played, and simultaneously select the corresponding video type; for example, a user may input a video source to be played and a corresponding video type in the interactive interface, so that the video type of the current video source can be determined from the information entered by the user.
For example, when a user selects a video file, the file identification module may be triggered first to identify the file name. If the identification fails, namely when the video type information is not identified in the file name, or when the user inputs information in an input window of the interactive interface, the input information identification module can be triggered to identify the video type information input by the user, so that the video type of the current video source can be determined.
In this example embodiment, the viewpoint type identifying module 102 may be configured to determine, when determining that the current video source is a 3D video source, a viewpoint type corresponding to the current video source.
In this example embodiment, the apparatus further includes: and the 3D playing mode executing module is used for executing a 3D playing mode when the current video source is determined to be a 3D video source, so as to determine the viewpoint type of the 3D video source.
Specifically, after the video type identification module determines that the current video source is a 3D video and outputs the video type identification result, the viewpoint type identification module may be triggered to determine the viewpoint type corresponding to the current 3D video. Specifically, the video type identification module triggers the 3D playing mode executing module to execute the 3D playing mode.
The viewpoint type may refer to the disparity map arrangement type corresponding to a 3D video source. In general, viewpoint types of 3D video may include: any one of a left-right view format, an up-down view format, an eight view format, and a nine view format. In addition, a 2D+depth format is also possible.
Specifically, when the current video source is determined to be a 3D video source, the 3D play mode execution module may be activated to execute the 3D play mode and trigger the task of identifying the viewpoint type of the video source.
In this example embodiment, the viewpoint type identification module may include: and the image recognition execution module is used for acquiring the target image frame corresponding to the current video source, and carrying out image recognition on the target image frame so as to determine the viewpoint type of the current video source according to the image recognition result.
Specifically, the first frame image of the current video source can be acquired and taken as the target image frame; image recognition is carried out on the first frame image, and the number of copies of the same object shown from different viewing angles in the image is judged, so that the viewpoint type is determined. For example, an image capture service may be invoked to first capture a first frame of image as the target image frame after the current video source begins to play, and to perform image content recognition. For example, the image frame shown in fig. 2 contains 9 identical mobile phone articles, with a certain difference in the viewing angle of each; accordingly, the viewpoint type is determined to be the nine-viewpoint type. For example, an image viewpoint recognition model based on a convolutional neural network may be trained in advance, the acquired first frame image is input into the image viewpoint recognition model, and the corresponding viewpoint type recognition result is output. The convolutional neural network-based image recognition model can be trained by conventional means, and the specific training process and the hierarchical structure of the model are not particularly limited here.
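As a simplified illustration of this recognition step (a tile-similarity heuristic rather than the trained convolutional neural network mentioned above; the candidate layouts and the similarity threshold are assumptions), one might proceed as follows:

```python
import numpy as np

def guess_viewpoint_type(frame: np.ndarray) -> str:
    """Heuristically guess the disparity-map layout of a target frame (H x W x 3 RGB).

    A real implementation would downsample first and likely use a trained model;
    this sketch only compares candidate sub-view tiles for mutual similarity.
    """
    candidates = {
        "left-right": (1, 2),   # 1 row x 2 columns of sub-views
        "up-down":    (2, 1),
        "eight-view": (2, 4),
        "nine-view":  (3, 3),
    }
    h, w, _ = frame.shape
    best, best_score = "unknown", 0.60          # assumed minimum similarity
    for name, (rows, cols) in candidates.items():
        th, tw = h // rows, w // cols
        tiles = [frame[r * th:(r + 1) * th, c * tw:(c + 1) * tw].astype(np.float32)
                 for r in range(rows) for c in range(cols)]
        ref = tiles[0].ravel()
        # mean correlation between the first tile and all the others
        score = np.mean([np.corrcoef(ref, t.ravel())[0, 1] for t in tiles[1:]])
        if score > best_score:
            best, best_score = name, score
    return best
```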
Alternatively, in some exemplary embodiments, the viewpoint type of the current video source may be input in the interactive interface by the user; so that the viewpoint type of the current video source can be determined according to the information input by the user.
In this example embodiment, the image interleaving processing module 103 may be configured to determine a corresponding interleaving operation policy according to the viewpoint type, and locally perform interleaving processing on the current video source based on the interleaving operation policy, so as to generate and display a corresponding naked-eye 3D video.
Specifically, for 3D videos of different viewpoint types, different interleaving policies may be employed to generate the corresponding naked eye 3D video. A mapping relationship between each viewpoint type and the corresponding interleaving operation policy may be preconfigured. After the current video source is identified as a 3D video and the corresponding viewpoint type is determined, the interleaving operation strategy for converting the 3D video into the naked eye 3D video can be determined.
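A minimal sketch of such a preconfigured mapping, using hypothetical policy parameters (the sub-view grid shapes follow the viewpoint formats named in this disclosure; the names and fields are assumptions, not actual display parameters):

```python
from dataclasses import dataclass

@dataclass
class InterleavePolicy:
    rows: int        # sub-view grid rows in the source frame
    cols: int        # sub-view grid columns in the source frame
    num_views: int   # number of viewpoints driven on the display

POLICY_TABLE = {
    "left-right": InterleavePolicy(rows=1, cols=2, num_views=2),
    "up-down":    InterleavePolicy(rows=2, cols=1, num_views=2),
    "eight-view": InterleavePolicy(rows=2, cols=4, num_views=8),
    "nine-view":  InterleavePolicy(rows=3, cols=3, num_views=9),
}

def select_policy(viewpoint_type: str) -> InterleavePolicy:
    """Look up the interleaving operation policy preconfigured for a viewpoint type."""
    return POLICY_TABLE[viewpoint_type]
```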
In this example embodiment, the image interleaving processing module 103 may include: the interleaving service calling module can be used for calling 3D interleaving service, performing image capture on the current video source frame by frame, and locally performing interleaving processing on the image frames corresponding to the current video source frame by frame according to the interleaving operation strategy so as to generate multi-view naked eye 3D video.
Specifically, a preconfigured 3D interleaving service may be invoked. In a Unity engine environment, a built-in canvas is used to configure the screen-recording window to the size of the whole screen of the terminal device, and the corresponding resolution is set. For a video source, full-screen pictures of the current screen are captured frame by frame at the set window size as the input data of the interleaving operation.
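For illustration only, the frame-capture step can be sketched outside the Unity environment with Pillow's ImageGrab (an assumed capture backend, not the canvas-based screen recording the disclosure describes):

```python
import numpy as np
from PIL import ImageGrab

def capture_fullscreen_frame(width: int, height: int) -> np.ndarray:
    """Grab one full-screen picture at the configured window size.

    Returns an H x W x 3 RGB array to be used as input data of the interleaving operation.
    """
    shot = ImageGrab.grab()               # capture the whole screen
    shot = shot.resize((width, height))   # match the set resolution
    return np.asarray(shot.convert("RGB"))
```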
The full-screen images captured frame by frame can then be interleaved locally by the terminal according to the determined interleaving operation strategy. Specifically, the interleaving operation may arrange the input image onto the regular, periodic pixel-array pattern determined by the number of views of the 3D terminal display and the corresponding optical parameters: RGB values are sampled from the pixel points of the input image and matched to the periodic pixel-array pattern. For example, referring to the nine-viewpoint RGB pixel arrangement diagram shown in fig. 3, if the R sub-pixel in the first row and first column of the arrangement diagram is marked with the number 9, the R value of the pixel point of the input nine-viewpoint image corresponding to that position is assigned to that R sub-pixel; the operation is repeated until all RGB values of all the pixel points in the arrangement diagram are assigned, which completes the 3D interleaving process, and a view with the naked eye 3D effect can thus be obtained, for example as shown in fig. 4.
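A minimal NumPy sketch of this sub-pixel assignment, assuming a nine-viewpoint source laid out as a 3x3 grid and a placeholder periodic arrangement pattern (a real pattern depends on the display's lenticular/optical parameters, which are not given here):

```python
import numpy as np

def make_demo_view_map(h: int, w: int, num_views: int = 9) -> np.ndarray:
    """Placeholder periodic pattern assigning a viewpoint index (1..num_views)
    to every R/G/B sub-pixel; a real pattern comes from the panel's optics."""
    rows, cols, chans = np.indices((h, w, 3))
    return ((cols * 3 + chans + rows) % num_views) + 1

def interleave_nine_view(frame: np.ndarray, view_map: np.ndarray) -> np.ndarray:
    """Interleave a nine-viewpoint source frame (H x W x 3 RGB, 3 x 3 sub-views).

    view_map has shape H x W x 3 and holds the viewpoint index for each output
    sub-pixel, i.e. the periodic arrangement pattern of fig. 3.
    """
    h, w, _ = frame.shape
    h, w = (h // 3) * 3, (w // 3) * 3                  # crop to multiples of 3
    frame, view_map = frame[:h, :w], view_map[:h, :w]
    th, tw = h // 3, w // 3

    # Nearest-neighbour upsample each sub-view back to full resolution so every
    # output sub-pixel can sample its assigned viewpoint at its own position.
    views = np.empty((9, h, w, 3), dtype=frame.dtype)
    for v in range(9):
        tile = frame[(v // 3) * th:(v // 3 + 1) * th,
                     (v % 3) * tw:(v % 3 + 1) * tw]
        views[v] = np.kron(tile, np.ones((3, 3, 1), dtype=frame.dtype))

    # For every output sub-pixel, copy the R/G/B value from the viewpoint that
    # the arrangement pattern assigns to that position.
    rows, cols, chans = np.indices((h, w, 3))
    return views[view_map - 1, rows, cols, chans]
```

An interleaved frame in the spirit of fig. 4 could then be produced with interleave_nine_view(frame, make_demo_view_map(*frame.shape[:2])).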
For different viewpoint types, such as images of the left-right viewpoint type and images of the nine-grid viewpoint type, the arrangement rules differ and the corresponding interleaving operation strategies have certain differences, but the principle is the same: pixels are sampled and rearranged according to certain rules and periods. For example, a shader scheme in the Unity engine can be used, and interleaving of images of different viewpoint types can be realized through parameter distinction, so that the 3D interleaving is compatible with the video sources of different viewpoint types currently in use.
In this example embodiment, the apparatus may further include: and the 2D playing mode executing module can be used for executing a 2D playing mode when the current video source is determined to be the 2D video source so as to directly play the current video source.
Specifically, when the 2D video is identified according to the identification information of the current video source or the input information of the user, the video can be directly played through the player.
In this example embodiment, the apparatus may further include: the display interface control module is used for displaying the current video source in the first display interface and collecting corresponding image frames; and displaying the naked eye 3D video in a second display interface.
Specifically, two display interfaces can be provided in the interactive interface. The first display interface can be used for displaying the current video source and showing the original display effect of the video; the second display interface may be used to display the 2D video or the converted naked eye 3D video. For example, when the terminal device is configured with a plurality of displays, the first display interface may be configured to be displayed in a first display, and the second display interface may be configured to be displayed in a second display; alternatively, in one display, the first display interface may be configured as a display interface with a smaller window size, and the second display interface as a display interface with a larger window size. Specifically, when a user selects a video file and opens it, the display interface control module may first control the video file to be displayed in the first display interface; and an interactive interface for the user to input the video type and viewpoint type can be displayed in the first display interface and/or the second display interface. If the current video source is identified as a 2D video, it can be displayed directly in the first display interface and the second display interface; during playing of the video source, the first display interface can be automatically hidden. Or, when the current video source is a 3D video, the original 3D video is displayed in the first display interface and image capturing is performed; the interleaved naked eye 3D video is displayed in the second display interface, and the first display interface may be automatically hidden. For example, the first display interface may be hidden in a minimized form; alternatively, the first display interface may be hidden by setting the second display interface to always remain at the forefront.
The display control device provided by the embodiments of the disclosure can be implemented and executed as a system application or plug-in, which is triggered when a user selects a video file to play or selects a link to play in a webpage. Referring to the flow shown in fig. 5, after the user selects the video source and starts playing the video with the default normal player, the control logic corresponding to each module of the display control device captures the screen content of the player as input and judges the video type: if the video type is judged to be 2D video, the 2D playing mode is directly executed and the 2D video is correctly displayed on the display device; or, when the current video source is judged to be 3D video, the 3D interleaving service can be triggered and multi-view 3D interleaving processing is carried out on the captured screen content images locally. The current display content of the screen is obtained in a frame-by-frame acquisition mode, the 3D playing mode is adjusted, the interleaving operation strategy is determined, the 3D multi-view interleaving processing is carried out locally on the images of the captured screen picture, and the multi-view naked eye 3D images are output directly; they can be played by using a common player, so that no delay and high reliability are ensured. The display device may be a TPC tablet supporting naked eye 3D, an MNT display, a large-sized TV, etc. In addition, using the images acquired by screen recording as the input of the application ensures the continuity of the interleaved image output, which solves the problem of high delay when the player plays live streams and also effectively solves the problem that a traditional video player cannot play naked eye 3D video.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Further, the embodiment of the present example also provides a display control method, which can be applied to the display control device described above. Referring to fig. 6, the display control method includes:
step S11, identifying the video type of the current video source;
step S12, when the current video source is determined to be a 3D video source, determining a viewpoint type corresponding to the current video source;
and step S13, determining a corresponding interleaving operation strategy according to the viewpoint type, and locally interleaving the current video source based on the interleaving operation strategy to generate and display a corresponding naked eye 3D video.
In some exemplary embodiments, the identifying the video type of the current video source includes:
reading file identification information of the current video source to determine the video type of the current video source, wherein the file identification information of the current video source is configured according to a preset rule; or
determining the video type of the current video source according to the input video type information corresponding to the current video source.
In some exemplary embodiments, when the current video source is determined to be a 3D video source, the method further comprises:
a 3D play mode is performed for determining a view type of the 3D video source.
In some exemplary embodiments, the determining the viewpoint type corresponding to the current video source includes:
and acquiring a target image frame corresponding to the current video source, and carrying out image recognition on the target image frame so as to determine the viewpoint type of the current video source according to an image recognition result.
In some exemplary embodiments, the interleaving the current video source based on the interleaving policy includes:
and calling a 3D interleaving service, performing image capturing on the current video source frame by frame, and locally performing interleaving processing on the image frames corresponding to the current video source frame by frame according to the interleaving operation strategy so as to generate the multi-view naked eye 3D video.
In some exemplary embodiments, the method further comprises:
displaying the current video source in a first display interface and collecting corresponding image frames; and
displaying the naked eye 3D video in a second display interface.
In some exemplary embodiments, the viewpoint types corresponding to the 3D video source include: any one of a left-right view format, an up-down view format, an eight view format, and a nine view format.
In some exemplary embodiments, referring to fig. 7, the method further comprises:
step S71, identifying the video type of the current video source;
step S72, when the current video source is determined to be a 2D video source, executing a 2D playing mode to directly play the current video source.
The specific details of each step in the above display control method are described in detail in the corresponding display control device, so that the details are not repeated here.
It is noted that the above-described figures are only schematic illustrations of processes involved in a method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Fig. 8 shows a schematic diagram of an electronic device suitable for use in implementing embodiments of the invention.
It should be noted that, the electronic device 1000 shown in fig. 8 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present disclosure.
As shown in fig. 8, the electronic apparatus 1000 includes a central processing unit (Central Processing Unit, CPU) 1001 that can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 1002 or a program loaded from a storage section 1008 into a random access Memory (Random Access Memory, RAM) 1003. In the RAM 1003, various programs and data required for system operation are also stored. The CPU 1001, ROM 1002, and RAM 1003 are connected to each other by a bus 1004. An Input/Output (I/O) interface 1005 is also connected to bus 1004.
The following components are connected to the I/O interface 1005: an input section 1006 including a keyboard, a mouse, and the like; an output portion 1007 including a Cathode Ray Tube (CRT), a liquid crystal display (Liquid Crystal Display, LCD), and a speaker; a storage portion 1008 including a hard disk or the like; and a communication section 1009 including a network interface card such as a LAN (Local Area Network ) card, a modem, or the like. The communication section 1009 performs communication processing via a network such as the internet. The drive 1010 is also connected to the I/O interface 1005 as needed. A removable medium 1011, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is installed on the drive 1010 as needed, so that a computer program read out therefrom is installed into the storage section 1008 as needed.
In particular, according to embodiments of the present invention, the processes described below with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present invention include a computer program product comprising a computer program loaded on a storage medium, the computer program comprising program code for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 1009, and/or installed from the removable medium 1011. When executed by a Central Processing Unit (CPU) 1001, the computer program performs various functions defined in the system of the present application.
Specifically, the electronic device may be an intelligent mobile electronic device such as a mobile phone, a tablet computer or a notebook computer. Alternatively, the electronic device may be an intelligent electronic device such as a desktop computer.
It should be noted that, the storage medium shown in the embodiments of the present invention may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any storage medium that is not a computer readable storage medium and that can transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present invention may be implemented by software, or may be implemented by hardware, and the described units may also be provided in a processor. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
It should be noted that, as another aspect, the present application further provides a storage medium, which may be included in an electronic device; or may exist alone without being incorporated into the electronic device. The storage medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the methods described in the embodiments below. For example, the electronic device may implement the steps shown in fig. 1.
Furthermore, the above-described drawings are only schematic illustrations of processes included in the method according to the exemplary embodiment of the present invention, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

1. A display control apparatus, characterized in that the apparatus comprises:
the video type identification module is used for identifying the video type of the current video source;
the viewpoint type identification module is used for determining the viewpoint type corresponding to the current video source when the current video source is determined to be a 3D video source;
and the image interleaving processing module is used for determining a corresponding interleaving operation strategy according to the viewpoint type, and locally interleaving the current video source based on the interleaving operation strategy so as to generate and display a corresponding naked eye 3D video.
2. The display control device according to claim 1, wherein the video type recognition module includes:
the file identification module is used for reading the file identification information of the current video source so as to determine the video type of the current video source; the file identification information of the current video source is configured according to a preset rule;
and the input information identification module is used for determining the video type of the current video source according to the input video type information corresponding to the current video source.
3. The display control apparatus according to claim 1, characterized in that the apparatus further comprises:
and the 3D playing mode executing module is used for executing a 3D playing mode when the current video source is determined to be a 3D video source, so as to determine the viewpoint type of the 3D video source.
4. A display control apparatus according to claim 1 or 3, wherein the viewpoint type recognition module includes:
and the image recognition execution module is used for acquiring the target image frame corresponding to the current video source, and carrying out image recognition on the target image frame so as to determine the viewpoint type of the current video source according to the image recognition result.
5. The display control device according to claim 1, wherein the image interleaving processing module includes:
and the interleaving service calling module is used for calling the 3D interleaving service, carrying out image capturing on the current video source frame by frame, and locally carrying out interleaving processing on the image frames corresponding to the current video source frame by frame according to the interleaving operation strategy so as to generate the multi-view naked eye 3D video.
6. The display control apparatus according to claim 1, characterized in that the apparatus further comprises:
the display interface control module is used for displaying the current video source in the first display interface and collecting corresponding image frames; and displaying the naked eye 3D video in a second display interface.
7. The display control apparatus according to claim 1, wherein the viewpoint type corresponding to the 3D video source includes: any one of a left-right view format, an up-down view format, an eight view format, and a nine view format.
8. The display control apparatus according to claim 1, characterized in that the apparatus further comprises:
and the 2D playing mode executing module is used for executing a 2D playing mode when the current video source is determined to be the 2D video source so as to directly play the current video source.
9. A display control method, characterized in that the method comprises:
identifying the video type of the current video source;
when the current video source is determined to be a 3D video source, determining a viewpoint type corresponding to the current video source;
and determining a corresponding interleaving operation strategy according to the viewpoint type, and locally interleaving the current video source based on the interleaving operation strategy to generate and display a corresponding naked eye 3D video.
10. A storage medium having stored thereon a computer program which, when executed by a processor, implements the display control method as claimed in claim 9.
11. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the display control method of claim 9 via execution of the executable instructions.
CN202211599467.3A 2022-12-12 2022-12-12 Display control device and method, storage medium and electronic equipment Pending CN115996286A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211599467.3A CN115996286A (en) 2022-12-12 2022-12-12 Display control device and method, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211599467.3A CN115996286A (en) 2022-12-12 2022-12-12 Display control device and method, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115996286A true CN115996286A (en) 2023-04-21

Family

ID=85993275

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211599467.3A Pending CN115996286A (en) 2022-12-12 2022-12-12 Display control device and method, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN115996286A (en)

Similar Documents

Publication Publication Date Title
CN109803175B (en) Video processing method and device, video processing equipment and storage medium
US9485493B2 (en) Method and system for displaying multi-viewpoint images and non-transitory computer readable storage medium thereof
CN112954450B (en) Video processing method and device, electronic equipment and storage medium
CN104618803A (en) Information push method, information push device, terminal and server
US10935788B2 (en) Hybrid virtual 3D rendering approach to stereovision
CN110062157B (en) Method and device for rendering image, electronic equipment and computer readable storage medium
CN109640180B (en) Method, device, equipment, terminal, server and storage medium for 3D display of video
US11785195B2 (en) Method and apparatus for processing three-dimensional video, readable storage medium and electronic device
CN109121000A (en) A kind of method for processing video frequency and client
CN105554430A (en) Video call method, system and device
CN104023181A (en) Information processing method and device
Li et al. Enhancing 3d applications using stereoscopic 3d and motion parallax
CN111246196B (en) Video processing method and device, electronic equipment and computer readable storage medium
CN109743566A (en) A kind of method and apparatus of the video format of VR for identification
CN112163993A (en) Image processing method, device, equipment and storage medium
CN114449303A (en) Live broadcast picture generation method and device, storage medium and electronic device
CN109922326B (en) Method, device, medium and equipment for determining resolution of naked eye 3D video image
CN111506241A (en) Special effect display method and device for live broadcast room, electronic equipment and computer medium
CN109816791B (en) Method and apparatus for generating information
CN116489424A (en) Live background generation method and device, electronic equipment and computer readable medium
CN107491934B (en) 3D interview system based on virtual reality
CN115996286A (en) Display control device and method, storage medium and electronic equipment
CN113810755A (en) Panoramic video preview method and device, electronic equipment and storage medium
CN112083901A (en) Image processing method, device, equipment and storage medium
CN112055246B (en) Video processing method, device and system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination