CN114630101A - Display device, VR device and display control method of virtual reality application content - Google Patents

Display device, VR device and display control method of virtual reality application content

Info

Publication number
CN114630101A
CN114630101A (application CN202210190102.9A)
Authority
CN
China
Prior art keywords
display
application
image
display device
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210190102.9A
Other languages
Chinese (zh)
Inventor
贾亚洲
丁国耀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Publication of CN114630101A publication Critical patent/CN114630101A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Abstract

The present application relates to the technical field of display devices, and in particular to a display device, a VR device, and a display control method for virtual reality application content. It can, to a certain extent, solve the problem that a VR device cannot perform 3D display when a user uses it to watch a 3D application running on a display device. The display device includes: a display for displaying a user interface; an image acquisition interface for acquiring a user image; and a first controller configured to: determine the application type of the foreground application in the user interface when a VR device is detected to be connected to the display device; when the application type is a first display format type, acquire the application background image displayed by the foreground application in the user interface and the user image acquired through the image acquisition interface; and send the application background image and the user image to the VR device, where they are rendered so that the VR device displays a 3D picture.

Description

Display device, VR device and display control method of virtual reality application content
The present application claims priority to Chinese patent application No. 202111164454.9, entitled "A display device and network connection method", filed with the China National Intellectual Property Administration on September 30, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of display devices, and in particular to a display device, a VR device, and a display control method for virtual reality application content.
Background
VR technology is a human-computer interaction technology that renders a user's movements and actions into a computer-generated virtual picture, giving the user an immersive experience.
In some implementations of displaying television virtual reality application content, a television runs a 3D application while its screen displays a 2D picture. When VR glasses are connected to the television, the television records the 2D picture played on its user interface and transmits the recording to the VR glasses through the television's screen recording interface, so that the VR glasses play the application content.
However, since the data the television transmits to the VR glasses in the above scheme is screen recording data, the picture the user watches is still 2D even when viewing the television's 3D application through the VR glasses.
Disclosure of Invention
In order to solve the problem that a VR device cannot perform 3D display when a user uses it to watch a virtual reality application running on a display device, the present application provides a display device, a VR device, and a display control method for virtual reality application content.
The embodiment of the application is realized as follows:
a first aspect of an embodiment of the present application provides a display device, including: a display for displaying a user interface; the image acquisition interface is used for acquiring a user image; a first controller configured to: determining an application type of foreground application in the user interface when the VR device is detected to be connected to the display device; when the application type is a first display format type, acquiring an application background image displayed in the user interface by the foreground application and the user image acquired by the image acquisition interface; and sending the application background image and the user image to the VR equipment, and rendering the application background image and the user image in the VR equipment so that the VR equipment displays a 3D picture.
A second aspect of an embodiment of the present application provides a VR device, including: a second controller configured to: when the display equipment with the foreground application in the first display format type is connected, receiving an application background image of the user interface foreground application and a user image which are sent by the display equipment, wherein the user image is acquired by an image acquisition interface of the display equipment; and controlling a VR device user interface to display a 3D picture generated by rendering according to the application background image and the user image.
A third aspect of the embodiments of the present application provides a method for controlling display of virtual reality application content, where the method includes: determining the application type of foreground application when the connection of VR equipment is detected; when the application type is a first display format type, acquiring an application background image displayed by the foreground application and a user image acquired by an image acquisition interface; and sending the application background image and the user image to the VR equipment, and rendering the application background image and the user image in the VR equipment to enable the VR equipment to display a 3D picture.
A fourth aspect of the embodiments of the present application provides a method for controlling display of virtual reality application content, where the method includes: when the display device is connected with a display device with a foreground application of a first display format type, receiving an application background image of a user interface foreground application and a user image which are sent by the display device, wherein the user image is acquired by an image acquisition interface; and displaying a 3D picture generated by rendering according to the application background image and the user image.
The beneficial effects of the present application are as follows: by determining the foreground application type, different linkage schemes between the VR device and the display device can be implemented; further, by acquiring the foreground application's background image and the user image, the virtual reality application content of the display device can be displayed in 3D on the VR device; further, by constructing dynamic interaction between the VR device and the display device, the VR device can be used to operate the foreground application, achieving synchronous display on the display device and the VR device.
Drawings
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to one or more embodiments of the present application;
fig. 2 is a block diagram of a hardware configuration of a display apparatus 200 according to one or more embodiments of the present application;
fig. 3 is a block diagram of a hardware configuration of the control apparatus 100 according to one or more embodiments of the present application;
fig. 4 is a schematic diagram of a software configuration in a display device 200 according to one or more embodiments of the present application;
FIG. 5A is a logic diagram of a display device and VR glasses implementing displaying different types of application content according to an embodiment of the present application;
fig. 5B is a schematic diagram of a system framework of a display device and VR glasses for displaying virtual reality application content according to another embodiment of the present application;
fig. 5C is a schematic flowchart of a display device implementing screen recording of flat panel display application content according to another embodiment of the present application;
fig. 5D is a schematic flowchart of a display device implementing screen recording of flat panel display application content according to another embodiment of the present application;
fig. 5E is a schematic diagram of a system framework for data transmission by a display device and VR glasses according to another embodiment of the present application;
fig. 5F is a schematic flowchart illustrating a display device and VR glasses implementing virtual reality application content display according to another embodiment of the present application;
FIG. 5G is a system framework diagram of a display device and VR glasses for displaying virtual reality application content according to another embodiment of the present application;
fig. 5H illustrates a logic diagram of VR glasses controlling display of virtual reality application content by a display device according to another embodiment of the present application;
FIG. 6A is a logic diagram illustrating network card index acquisition according to another embodiment of the present application;
FIG. 6B is a diagram illustrating a multi-NIC data transmission according to another embodiment of the present application;
FIG. 7A is a schematic diagram of a user interface for displaying device data according to an embodiment of the present application;
fig. 7B is a schematic diagram illustrating a structure and a user interface of a display device and VR glasses for data transmission according to another embodiment of the application;
fig. 7C is a schematic diagram illustrating an architecture and a user interface of a display device and VR glasses for data transmission according to another embodiment of the present application;
FIG. 7D is a diagram illustrating an architecture and user interface for data transmission between display devices according to another embodiment of the present application;
fig. 7E shows a schematic diagram of a display device, VR glasses, and a user interface for data transmission according to another embodiment of the present application.
Detailed Description
To make the objects, embodiments and advantages of the present application clearer, exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through a smart device 300, a VR device 500, or a control apparatus 100. In some embodiments, the VR device 500 may be implemented as VR glasses or a VR headset.
In some embodiments, the control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device includes an infrared protocol communication or a bluetooth protocol communication, and other short-distance communication methods, and controls the display device 200 in a wireless or wired manner. The user may input a user instruction through a key on a remote controller, voice input, control panel input, etc., to control the display apparatus 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
Fig. 3 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 3, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert it into an instruction that the display device 200 can recognize and respond to, serving as an interaction intermediary between the user and the display device 200.
As shown in fig. 2, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
The display 260 may be a liquid crystal display, an OLED display, or a projection display; it may also be a projection device together with a projection screen.
The communicator 220 is a component for communicating with an external device or a server according to various communication protocol types. For example, the communicator may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module, other network or near-field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the external control apparatus 100 or the server 400 through the communicator 220.
The user interface is used for receiving control signals from the control apparatus 100 (e.g., an infrared remote control).
The detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
The external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
The tuner demodulator 210 receives broadcast television signals in a wired or wireless manner, and demodulates audio/video signals, as well as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
A "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a Wi-Fi driver, a USB driver, an HDMI driver, sensor drivers (such as fingerprint, temperature, and pressure sensors), a power driver, and the like.
The embodiments of the present application can be applied to various types of display devices and VR devices (including but not limited to smart televisions, liquid crystal televisions, VR glasses, VR headsets, and the like). The display device, the VR device, and the display control method of virtual reality application content are described below by taking a display device and VR glasses as examples.
FIG. 5A is a logic diagram illustrating displaying different types of application content according to an embodiment of the present application.
In some embodiments, when the first controller of the display device detects a VR glasses connection, it determines the application type of the foreground application in the user interface; the application types may include a first display format type and a second display format type.
The first display format type may be implemented as a virtual reality application type, such as a 3D application; the second display format type may be implemented as a flat panel display application type, such as a 2D application. For example, after the VR glasses are connected to the display device, the first controller first obtains the basic information of the foreground application in the user interface, which includes whether the application is of the virtual reality application type or the flat panel display application type. It should be noted that when multiple applications are running in the user interface, the first controller first obtains only the foreground application's basic information; it does not immediately obtain the basic information of the other, background applications.
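The dispatch on the foreground application's type can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the type constants and the `AppInfo` structure are assumptions.

```python
from dataclasses import dataclass

# Hypothetical labels for the two application types described above.
VIRTUAL_REALITY = "first_display_format"   # e.g. a 3D application
FLAT_PANEL = "second_display_format"       # e.g. a 2D application

@dataclass
class AppInfo:
    name: str
    app_type: str
    foreground: bool

def select_link_scheme(apps):
    # Only the foreground application's basic information is consulted;
    # background applications are ignored at this stage.
    foreground = next(a for a in apps if a.foreground)
    if foreground.app_type == VIRTUAL_REALITY:
        return "render_3d"      # send background image + user image
    return "record_screen"      # record and stream the 2D interface

apps = [
    AppInfo("music", FLAT_PANEL, foreground=False),
    AppInfo("vr_game", VIRTUAL_REALITY, foreground=True),
]
print(select_link_scheme(apps))  # render_3d
```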
When the foreground application is of the virtual reality application type, the first controller acquires the foreground application's background image through a game engine component (Unity SDK); the application background image is the image the virtual reality application displays on the user interface while running, before a user is loaded into the scene.
It should be noted that the display device provided by the present application includes an image acquisition interface for acquiring user images and environment images within a detection range. The image acquisition interface may acquire images through a connected camera, such as a 3D camera or a depth camera. The camera may be externally connected to the image acquisition interface or, in some embodiments, embedded in the display device housing.
Taking the case where a depth camera is connected to the image acquisition interface as an example: while acquiring the application background image, the first controller simultaneously acquires the user image captured by the depth camera in the current scene, and can obtain a real-time image of the user operating the virtual reality application by computing the texture data acquired by the left and right cameras.
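The patent does not disclose the exact stereo computation, but the idea of separating a nearby user from the background via left/right camera data can be illustrated with a toy one-dimensional disparity estimate; everything below (function name, thresholds, the 1-D simplification) is an assumption for illustration only.

```python
def segment_user(left_row, right_row, max_shift=3, near_thresh=2):
    """Crude 1-D disparity estimate: for each left-image pixel, find the
    horizontal shift that best matches the right image. A large disparity
    means the point is close to the cameras, which this toy model treats
    as the user standing in front of the screen."""
    mask = []
    for x, value in enumerate(left_row):
        best_shift, best_err = 0, float("inf")
        for d in range(min(max_shift, x) + 1):
            err = abs(value - right_row[x - d])
            if err < best_err:
                best_err, best_shift = err, d
        mask.append(best_shift >= near_thresh)
    return mask

# Background pixels match at shift 0; "user" pixels match at shift 2.
right = [10, 20, 30, 40, 50, 60]
left = [10, 20, 30, 20, 30, 40]
print(segment_user(left, right))  # [False, False, False, True, True, True]
```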
After obtaining the application background image and the user image, the first controller encodes them together and transmits the encoded data over the network to the VR glasses at the opposite end. To realize a high-speed streaming data service, the dual-link transmission scheme provided in this specification can be used for data transmission to reduce data delay.
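The dual-link idea can be sketched as distributing encoded packets over two links and reordering them at the receiver. This is an illustrative sketch under assumed names; the actual scheme (scheduling, retransmission, multiple network cards as in fig. 6B) is not reproduced here.

```python
def send_dual_link(packets, links=2):
    """Distribute encoded packets round-robin over several links (e.g.
    two network cards), tagging each with a sequence number so the
    receiver can restore the original order."""
    queues = [[] for _ in range(links)]
    for seq, payload in enumerate(packets):
        queues[seq % links].append((seq, payload))
    return queues

def reassemble(queues):
    # Merge the per-link queues and sort by sequence number.
    merged = sorted(p for q in queues for p in q)
    return [payload for _, payload in merged]

frames = [b"f0", b"f1", b"f2", b"f3", b"f4"]
queues = send_dual_link(frames)
print(reassemble(queues) == frames)  # True
```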
After the VR glasses receive the data sent by the display device, the second controller renders the application background image and the user image separately and then composites them to realize 3D display on the VR glasses. In some embodiments, when the user rotates or performs other control operations with the VR glasses, the second controller transmits corresponding control data to the display device. The control data includes sensor data collected by the sensors the VR glasses are configured with; this may be sensor data generated by the user operating the VR glasses user interface, or by the user operating other VR glasses peripherals such as a joystick, as shown in fig. 5A.
In some embodiments, after the display device receives the control data sent by the VR glasses, the first controller passes the control data to the corresponding foreground application, so that the VR glasses control the foreground application.
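The control-data round trip can be sketched as decoding an event and forwarding it to the foreground application's state. The event schema, field names, and `ForegroundApp` class are hypothetical; they only illustrate the routing step described above.

```python
import json

class ForegroundApp:
    """Minimal stand-in for a foreground application reacting to
    head-rotation sensor data; field names are hypothetical."""
    def __init__(self):
        self.yaw = 0.0

    def handle(self, event):
        if event["type"] == "head_rotation":
            self.yaw = (self.yaw + event["delta_yaw"]) % 360.0

def route_control_data(raw, app):
    # Decode the control data sent by the glasses (headset sensors or
    # a joystick) and forward it to the foreground application.
    app.handle(json.loads(raw))
    return app.yaw  # new state, rendered back for the glasses

app = ForegroundApp()
yaw = route_control_data(
    json.dumps({"type": "head_rotation", "delta_yaw": 30.0}), app)
print(yaw)  # 30.0
```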
As a large-screen display terminal, the display device generally has professional image quality processing capability; the VR glasses, as a mobile device, can display video and audio content but generally lack strong image quality processing capability.
Therefore, processing the control data generated by the VR glasses on the display device side and feeding the result back to the VR glasses makes full use of the display device's hardware configuration and computing power, and improves the interaction efficiency and display effect of the VR glasses.
Through the above technical solution, synchronous display on the display device and the VR glasses can also be achieved: the user can simultaneously watch the 2D picture shown on the display device's large screen and the 3D picture shown by the VR glasses, realizing linkage between the display device and the VR glasses.
In some embodiments, when the VR glasses are connected to the display device and the foreground application is of the flat panel display application type, the first controller encodes the display data on the display device side and transmits it to the VR glasses.
First, the first controller creates an encoder and sets the screen recording parameters for recording the display picture of the user interface; it then records the display picture of the user interface while the flat panel display foreground application runs, and, using the created encoder, encodes the display picture and transmits it to the VR glasses by streaming communication. Streaming communication is an instant audio/video playback technology; unlike traditional playback modes such as downloading complete MPEG or MP3 files, it can play while transmitting, which reduces transmission waiting time and requires only a modest amount of storage space rather than a large amount for storing complete media files.
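The play-while-transmitting property can be illustrated with a generator that yields encoded chunks as soon as they fill, instead of buffering the whole recording. This is a toy sketch; a real pipeline would use an actual codec and network transport.

```python
def stream_encode(frames, chunk_size=2):
    """Yield encoded chunks as soon as they are ready instead of
    waiting for a complete file -- the 'play while transmitting'
    property of streaming described above."""
    buf = []
    for frame in frames:
        buf.append(frame)
        if len(buf) == chunk_size:
            yield b"".join(buf)
            buf = []
    if buf:                      # flush any trailing partial chunk
        yield b"".join(buf)

chunks = list(stream_encode([b"a", b"b", b"c"]))
print(chunks)  # [b'ab', b'c']
```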
After receiving the encoded display image data, the VR glasses decode it accordingly, so that the VR glasses can be used to watch the flat panel display application content of the display device. Thus, when the foreground application is of the flat panel display application type, the display device records the foreground application's display image in the user interface and transmits it to the VR glasses for 2D display.
It can be seen that, for display devices whose user interface is displayed in 2D regardless of whether an application of the first or second display format type is running, the technical solution and devices provided by the present application can stream the display device's picture to the VR glasses and implement different display schemes according to the application type: 2D display on the VR glasses for flat panel display applications, and 3D display on the VR glasses for virtual reality applications.
Fig. 5B is a system framework diagram illustrating virtual reality application content being displayed on VR glasses in another embodiment of the present application.
In some embodiments, in the process of displaying the display device's virtual reality application content on the VR glasses, the display device first encodes the generated audio and video; the encoded data is then transmitted to the VR glasses through the system framework service; the VR glasses then decode and play the received data.
When the VR glasses control the virtual reality foreground application, the sensor data they acquire is fed back to the display device through the system framework service; the display device receives the control data containing the sensor data and, based on it, controls the display content of the foreground application in the user interface, so that the scene displayed by the virtual reality foreground application changes on the display device's screen.
For example, the foreground application and the user generate audio and video data on the display device side. The video part is first passed through the media codec interface to the video codec module, and then through the image quality processing unit to produce a displayable video stream; the audio part is processed by the audio codec module via the audio track channel and output as a playable audio stream through the audio mixer.
after the video stream and the audio stream are subjected to sound and picture synchronization processing again, the video stream and the audio stream are transmitted to VR glasses in a streaming communication mode; the VR glasses receive the streaming data through the system framework service and control the virtual content player to play the transmission data; similarly, after the VR glasses obtain sensor data such as handles, the sensor data will also be fed back to the display device, and the data will be further distributed to foreground application.
In some embodiments, when the VR glasses are connected to the display device and the foreground application is of the flat panel display application type, the first controller may record the user interface display image as follows.
A screen recording database containing preset screen recording parameters is loaded on the display device side; the system screen recording interface is then called to acquire the audio stream and the video stream; after obtaining the data, the first controller encodes and transmits the audio stream and the video stream independently. In this process of acquiring and sending data streams, the first controller actually acquires the encoder's audio and video stream data cyclically and sends it to the VR glasses by streaming. In some embodiments, if a stop command is received while the audio and video streams are being streamed, the first controller stops the recording process; the logic is shown in fig. 5C.
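The cyclic acquire-and-send loop with a stop command can be sketched as follows; the function and queue names are illustrative, and a real implementation would pull from a live encoder rather than a list.

```python
import threading
from queue import Queue

def record_loop(encoded_frames, stop_event, out_queue):
    """Cyclically pull encoded audio/video data from the encoder and
    push it to the streaming sender until a stop command arrives."""
    for frame in encoded_frames:
        if stop_event.is_set():
            break
        out_queue.put(frame)

stop = threading.Event()
out = Queue()
record_loop([b"chunk0", b"chunk1"], stop, out)
print(out.qsize())  # 2

stop.set()          # a stop command halts further recording
record_loop([b"chunk2"], stop, out)
print(out.qsize())  # 2
```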
In some embodiments, during screen recording, the first controller acquires the user interface display image at the system's picture composition unit as the screen recording data; that is, the display device's application user interface and the physical channel image are composited at the picture composition unit to form the user interface display image.
For example, during application interface display, data such as the visual layer user interface, window user interface, and component user interface are typically passed to the display composition system layer (SurfaceFlinger) to compose a displayable layer (Layer);
the displayable layer (Layer) data is then written to the frame buffer, and the buffered data is transmitted to the picture composition unit (GOP) for picture composition.
For the transmission path of the physical channel image, the physical channel may include HDMI, DTV, ATV, and other channels. A corresponding video codec module is first selected to process the physical channel data, which is then processed by a time-domain jitter processing unit (FRC), a JNR unit, and a picture-quality processing unit (MVOP) to improve picture and sound quality. The data is then transmitted to the picture composition unit (GOP), where the first controller acquires the screen recording data; this unit composites the application user interface with the physical channel image. The composited data is passed to the screen timing controller (TCON) for processing and is displayed directly on the screen (PANEL), as shown in fig. 5D.
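The two paths into the picture composition unit can be modeled schematically. The stage names below follow the text (SurfaceFlinger, frame buffer, FRC/JNR/MVOP, GOP), but the "frames" are just labeled dictionaries, not real image data; this is a sketch of the data flow, not of the hardware.

```python
def surface_flinger(ui_layers):
    """Compose the UI layers (visual layer, window, component) into one Layer."""
    return {"kind": "layer", "layers": list(ui_layers)}

def frame_buffer(layer):
    """Buffer the composed layer before picture composition."""
    return {"kind": "buffered", "layer": layer}

def quality_chain(channel_frame):
    """FRC -> JNR -> MVOP stages applied to the physical-channel image."""
    return {"kind": "channel", "frame": channel_frame, "quality_processed": True}

def gop_compose(ui_buffered, channel_processed):
    """Picture composition unit: the tap point where the first controller
    acquires screen recording data containing BOTH sources."""
    return {"ui": ui_buffered, "channel": channel_processed}

composed = gop_compose(
    frame_buffer(surface_flinger(["window_ui", "component_ui"])),
    quality_chain("HDMI_frame"),
)
```

Tapping at `gop_compose` rather than at `surface_flinger` is exactly why the recorded image contains the quality-processed physical channel as well as the application UI.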
As can be seen from the above, if the screen recording data were acquired at the display composition system layer (SurfaceFlinger), the first controller could not record the display content of a physical channel. In the display device of the present application, the screen recording interface is placed at the point where the application user interface and the physical channel image are mixed, so that both the application user interface data and the physical channel image data are captured at the same time; moreover, the physical channel data has already undergone picture-quality processing at that point, giving a better display effect.
In some embodiments, when the VR glasses are connected to the display device and the foreground application is a flat-panel display application, the specific process of transmitting the audio stream and the video stream generated by the display device to the VR glasses through streaming communication may be implemented as follows.
Taking an H264-format video stream and a pulse-code-modulated audio stream as an example, the screen recording service of the display device records the mixed user interface display image through the screen recording interface.
After obtaining the video stream and the audio stream, the first controller encapsulates them into a Transport Stream (TS) and transmits the TS to the VR glasses at the opposite end through the system framework service. In the quality-processed audio and video streams, the video part can support recording at up to 1080p and 30 Hz; the audio part supports 16-bit pulse code modulation and can be converted into Advanced Audio Coding (AAC) before being sent, together with the video stream, to a mixer for encapsulation.
After the streaming service of the VR glasses receives the data, the second controller separates the audio stream and the video stream contained in it; the video stream is split and rendered through depth image processing and then played by the player, as shown in fig. 5E.
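A simplified mux/demux round trip illustrates the packaging step. Real TS packets are 188-byte cells with PIDs and timestamps; here they are reduced to (stream_id, payload) tuples purely for illustration.

```python
def mux_ts(video_chunks, audio_chunks):
    """Interleave the H264 and AAC elementary streams into one transport
    sequence, tagging each packet with a PID-like stream id."""
    packets = []
    for v, a in zip(video_chunks, audio_chunks):
        packets.append(("video", v))
        packets.append(("audio", a))
    return packets

def demux_ts(packets):
    """Opposite-end separation performed by the second controller."""
    video = [p for sid, p in packets if sid == "video"]
    audio = [p for sid, p in packets if sid == "audio"]
    return video, audio
```

The round trip is lossless: demuxing the muxed stream returns the original elementary streams.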
In some embodiments, after the VR glasses are connected to the display device, when the foreground application is a 3D application of the virtual reality type, the user interface loads and displays the 3D application information, and the first controller acquires the application background image of the foreground application through the game engine component.
For a user image that needs to be composited with the application background image, the first controller acquires the user image from a depth camera; the depth camera may include a left camera and a right camera and can be used to capture texture data. The application background image and the texture data are then encoded and transmitted to the VR glasses over the network for rendering and display, as shown in fig. 5F.
Taking a display device running a 3D application as an example, the first controller first acquires the application background image of the 3D application and the left and right camera images captured by the depth camera, where the left and right camera images contain the user image. To improve transmission reliability and efficiency, the first controller encodes the image data and transmits the encoded data to the VR glasses over the connected network. The VR glasses at the opposite end receive and decode the encoded data; the playback software then applies left-eye and right-eye visual-effect processing to the left and right camera images, composites them with the application background image, renders the texture data onto the application background image, and finally plays and displays the 3D picture. The overall scheme framework is shown in fig. 5G.
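The compositing step at the VR side can be sketched as follows. The per-eye processing is reduced to tagging, and the image arguments are placeholders; the function names are invented for illustration.

```python
def compose_stereo_frame(background, left_cam, right_cam):
    """Composite the decoded left/right camera images (which contain the
    user image) with the shared application background, one output per eye."""
    def one_eye(eye, cam):
        return {"eye": eye, "background": background, "user_image": cam}
    return {"left": one_eye("L", left_cam), "right": one_eye("R", right_cam)}
```

The key property is that both eyes share the single application background while each carries its own camera view, which is what produces the stereoscopic 3D effect.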
In some embodiments, when the VR glasses are connected to the display device and the foreground application is of the virtual reality type, the user can control the virtual reality foreground application of the display device through the VR glasses, which may be implemented by the following scheme.
First, when the VR glasses sensor detects a user operation, the detection service transmits the acquired current sensor data to the detection service client through inter-process communication.
Then, based on the sensor data and the timestamp information from the data production process, the second controller uses a prediction algorithm to obtain the predicted sensor data that the VR glasses generate during the transmission delay of the current sensor data.
Finally, the VR glasses transmit the current sensor data and the predicted sensor data to the display device through the system framework service; together they form the control data for controlling the virtual reality foreground application, which may also be called quaternion data. The display device controls the virtual reality foreground application to display content in the user interface based on the current sensor data and the predicted sensor data, as shown in fig. 5H.
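One common form of such a prediction algorithm is dead reckoning: extrapolating the pose forward by the expected transmission delay. The sketch below works on per-axis angles for brevity; a production implementation would operate on the quaternion itself, and all names here are illustrative.

```python
def predict_pose(current_angles, angular_velocity, delay_s):
    """Extrapolate head pose over the transmission delay:
    predicted = current + velocity * delay (per axis)."""
    return tuple(c + w * delay_s for c, w in zip(current_angles, angular_velocity))

def build_control_data(current_angles, angular_velocity, delay_s):
    """Pair the measured sample with its prediction, as sent to the display
    device to form the control data."""
    return {
        "current": current_angles,
        "predicted": predict_pose(current_angles, angular_velocity, delay_s),
    }
```

Sending both the measured and predicted samples lets the display device compensate for the latency accumulated while the packet was in transit.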
The following describes a technical scheme for realizing high-speed streaming communication based on multilink network transmission while 3D application content run by a smart television is displayed on VR glasses.
In some embodiments, when the display device plays a live channel or plays a video program through an installed application, the live channel or video program may be played in the foreground user interface. If the user starts several video-playback applications, any of them that is not being played and displayed on the foreground user interface may be switched to run in the background.
In some embodiments, after the display device launches the first application, the controller of the display device controls the launched first application to be played and displayed on the foreground user interface.
When the display device is able to access multiple network environments, the controller controls the first application on the foreground user interface to transmit data through the first network link.
Correspondingly, the display device can access multiple network environments and can be configured with a corresponding network card for each network environment it accesses simultaneously. For example, if a television can communicate with the wide area network through a first network link and a second network link, the television can be configured with a corresponding first network card and second network card, where the first network card is dedicated to data transmission on the first network link and the second network card is dedicated to data transmission on the second network link. The video-playback applications installed on the display device include a first video application, a second video application, and a third video application.
For example, when the display device starts only the first video application, the controller controls the first network card to drive the corresponding first network link and provide data communication for the first application playing a video program in the foreground user interface. In other words, the first application playing a program on the foreground user interface at this moment performs network communication through the first network card of the display device.
In some embodiments, when the display device simultaneously launches multiple playable applications, the controller will identify the application playing the program in the foreground user interface.
For example, the user starts the second video application from the television desktop; after the second video application is started, the first video application switches to the background and continues running, while the second video application is played and displayed on the foreground user interface. After receiving the user's instruction to start the second video application, the controller of the display device determines which application is currently playing a program in the foreground.
In some embodiments, after the display device starts the first application and the second application, if the display device foreground user interface displays the second application, the controller controls the plurality of networks available to the television to provide data transmission services for the first application and the second application, respectively.
For example, the networks usable by the display device include a first network and a second network, where the first network corresponds to a first network card and the second network to a second network card. The first controller can configure the first network and its network card to always serve the application playing a program in the foreground user interface, and configure the second network and its network card to always serve applications that have switched to the background, continue running, and still need network communication.
When the first video application is playing on the foreground user interface, the first network and first network card support its network communication. When the user starts the second video application so that it plays on the foreground user interface, the controller performs network switching control: it switches the first video application to continue running in the background, controls the second video application to communicate over the first network through the first network card, and controls the first video application, now in the background, to communicate over the second network through the second network card.
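The switching rule amounts to a simple assignment function: whichever application is in the foreground gets the first (better) link, and everything else shares the second. The NIC names below are placeholders, not the device's real interface names.

```python
FOREGROUND_NIC = "nic1"   # first network card / first network link
BACKGROUND_NIC = "nic2"   # second network card / second network link

def assign_nics(running_apps, foreground_app):
    """Map each running application to the NIC it should currently use."""
    return {
        app: (FOREGROUND_NIC if app == foreground_app else BACKGROUND_NIC)
        for app in running_apps
    }
```

Re-evaluating the mapping whenever the foreground application changes produces exactly the swap described above.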
It should be noted that, in general, a user may allocate multiple available networks to the multiple network cards of a television. For example, with 2 available networks and 2 network cards, the first network link, which has relatively good network quality, can be allocated to the first network card and the other, second network link to the second network card. In this way the program currently being watched in the foreground, which matters most to the user, acquires data over a fixed, dedicated first network link, avoiding a poor network experience, such as playback stuttering from insufficient network bandwidth, when multiple applications start simultaneously.
In some embodiments, while the foreground user interface plays the second video application, the display device controller may control the second application to bypass the protocol stack and perform network communication through the first network card.
For example, after the first video application and the second video application are started, the controller, using the network attributes recorded when the first network link and the second network link were enabled, obtains from the system kernel of the display device a first index corresponding to the first network card and a second index corresponding to the second network card, based on the names of the two network cards.
Before each video application transmits network data, the controller determines which application is in the foreground user interface according to the system attribute field.
If the second video application is playing in the foreground, the controller bypasses the protocol stack based on the first index and lets the second video application communicate through the first network card: the first index corresponding to the first network card is obtained through the attribute field, the protocol stack is then bypassed based on that index, and the network data sent and received by the second application is passed to the first network card through the data interface.
Correspondingly, for the first video application running in the background, the controller bypasses the protocol stack based on the second index so that it communicates through the second network card: the second index corresponding to the second network card is obtained through the attribute field, the protocol stack is bypassed based on that index, and the network data sent and received by the first video application is passed to the second network card through the data interface, as shown in fig. 6B.
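On Linux/Android, the effect described, steering one application's traffic to a specific NIC, can be approximated with the standard SO_BINDTODEVICE socket option; this is a stand-in for the patent's index-based protocol-stack bypass, not the same mechanism. The NIC names and index table below are illustrative placeholders, and the setsockopt call requires CAP_NET_RAW, so only the selection logic is exercised here.

```python
import socket

NIC_INDEX = {"eth0": 1, "wlan0": 2}   # placeholder for the attribute fields

def nic_for_app(app, foreground_app):
    """Pick the NIC (name and index) for an app: foreground gets the first
    card, everything else the second."""
    name = "eth0" if app == foreground_app else "wlan0"
    return name, NIC_INDEX[name]

def bind_socket_to_nic(sock, nic_name):
    """Bind a socket directly to a device so its traffic leaves through
    that NIC regardless of normal route selection. Needs CAP_NET_RAW."""
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BINDTODEVICE,
                    nic_name.encode() + b"\0")
```

In practice each application's sockets would be bound once at creation and rebound when the foreground application changes.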
It should be noted that a data request sent to the wide area network through the first network card receives its corresponding reply through the first network card, while other applications send and receive data through the second network card; this ensures smooth network transmission for the foreground application.
In some embodiments, when the first network link and the second network link are enabled, the controller may obtain the indexes corresponding to the first network card and the second network card, so that the different network cards can be located by their indexes for data transceiving.
When multiple applications are started, the controller creates a temporary interface in the system and then sends a signaling message to the system kernel through an input/output control (ioctl) instruction. Such instructions are mainly used for communication between an application program and a driver, or between drivers in a device stack, and are carried as messages in input/output requests.
when the first network card is configured as a wired network card and the second network card is configured as a wireless network card, the controller can acquire a first index corresponding to the wired network card from the kernel through the name of the wired network card and acquire a second index corresponding to the wireless network card from the kernel through the name of the wireless network card;
After the indexes of the network cards are obtained, the controller closes the temporary data interface used to send the signaling and writes the indexes of the first and second network cards into different Android system attribute fields, so that they can be read during subsequent data transmission, as shown in fig. 6A.
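The same name-to-index lookup is widely available: Python's `socket.if_nametoindex` wraps the SIOCGIFINDEX ioctl the text describes. The property-field key format below is invented for illustration; it is not a real Android property name.

```python
import socket

def publish_nic_indexes(nic_names, props):
    """Resolve each NIC name to its kernel interface index and record it in
    a dict standing in for the Android system attribute fields."""
    for name in nic_names:
        try:
            props["persist.net.%s.index" % name] = socket.if_nametoindex(name)
        except OSError:
            props["persist.net.%s.index" % name] = -1   # interface absent
    return props
```

Subsequent transmissions can then read the index back from the property dict instead of repeating the ioctl, which is the optimization the text describes.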
In some embodiments, after the display device starts multiple applications, and before the controller assigns different network cards to support them, the controller obtains the first network card's address through the network card service of the corresponding first network link when the first application is playing a program on the foreground user interface; after the second application starts, the controller obtains the address of the second network card, corresponding to the second network link, through the connection service of the Android framework process, so that different applications of the display device can communicate through different network cards.
For example, when a video application is playing a program on the foreground user interface of the display device, the controller obtains the IP address of the corresponding network card through that card's network card service. When foreground and background applications exist simultaneously, the network card used by the background application obtains its IP address through the Android framework process, and the addresses are finally written into the kernel and the network cards through the Android connection service, ensuring that the Android television is connected to two different networks and holds both IP addresses at the same time.
For another example, after several televisions in a home environment are networked, images collected by their different cameras can be fused and displayed on one television screen through the corresponding application. After two televisions are networked, one television can transmit its camera's video stream to the other over the local area network for display; if the two televisions can transmit data simultaneously, the transmission speed can be improved.
In some embodiments, if the user switches the first video application back to the foreground user interface, the controller controls the second video application, now switched to the background, to bypass the protocol stack and communicate through the second network link corresponding to the second network card; that is, the network used by the second video application changes from the first network link it used in the foreground to the second network link in the background, while the first video application, now playing on the foreground user interface, switches to the first network link corresponding to the first network card.
It should be noted that, in general, when the second application is displayed on the foreground user interface, the first application is switched to run in the background; in some embodiments, the television may also be configured so that when the second application is switched from the background to the foreground user interface for playing, the first application can also be played and displayed on the foreground user interface at the same time.
In some embodiments, when a plurality of video applications of the display device are started, the controller may immediately acquire the application program displayed in the foreground user interface.
For example, the first application is playing in the foreground user interface; when the second application is launched, the controller triggers the second application's activity lifecycle, including creation, starting, and rendering the foreground picture. Leaving an activity is usually triggered by the user pressing the home key; after the activity goes to the background, switching back to it calls the restart function first and then the start function.
When the activity stack of the Android framework layer resumes an activity to the visible foreground, the controller can obtain the activity record at that moment, from which the package name of the application playing and displayed in the foreground user interface can be extracted; the controller then writes the foreground application's package name into an Android system attribute field so that it can be read when selecting network cards for different applications.
In some embodiments, an activity generally has four states. In the running state, a newly started activity is pushed onto the stack, sits at the front of the screen and the top of the stack, and is in an activated, loaded state that is visible and can interact with the user. In the paused state, the activity is covered by another transparent or dialog-style activity; it remains connected to the window manager and the system maintains its internal state, and it is still visible, but it has lost focus and cannot interact with the user. In the stopped state, the activity is invisible; its current data and user interface state must be saved, otherwise they are lost once the activity exits or is closed. In the terminated state, after the activity is killed or before it is started, the activity has been removed from the activity stack and must be restarted before it can be displayed and used.
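The four states and the transitions described above can be captured as a small table. The event names ("cover", "hide", "return", "kill") are informal labels for the lifecycle triggers in the text, not framework callback names.

```python
# Transition table over the four activity states.
TRANSITIONS = {
    ("running", "cover"): "paused",      # covered by a transparent/dialog activity
    ("paused", "hide"): "stopped",       # no longer visible
    ("paused", "return"): "running",     # regains focus
    ("stopped", "return"): "running",    # restart -> start -> visible again
    ("running", "kill"): "terminated",
    ("paused", "kill"): "terminated",
    ("stopped", "kill"): "terminated",
}

def step(state, event):
    """Apply one lifecycle event; unknown combinations leave the state as-is."""
    return TRANSITIONS.get((state, event), state)
```

Walking the table shows, for example, that a covered-then-hidden activity must pass through paused and stopped before it can return to running.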
In some embodiments, the currently launched applications of the display device include a first video application and a second video application; after the user sends a start instruction to the third video application on the application desktop through the remote controller, the controller determines which of the 3 applications is playing and displayed in the foreground user interface.
If the third video application is currently playing and displayed on the foreground user interface, then, as in the scheme above, the controller controls the third application to bypass the protocol stack and perform network communication and data transceiving through the first network card; meanwhile, the controller controls the second network card and its corresponding second network link to support network communication for the first and second video applications in the background.
It can be understood that when multiple applications are started, the application playing a program in the foreground user interface preferentially communicates through the first network link, while applications in the background communicate through the second network link; the communication quality of the first network link is usually higher than that of the second, which improves the user's experience of watching videos.
It should be noted that, in some embodiments, the television is configured to display and play multiple applications on the foreground user interface at the same time; in that case, all of the applications playing programs in the foreground may use the first network link corresponding to the first network card, and the first network link is not limited to supporting only 1 foreground application.
In some embodiments, where the display device may use multiple networks, the first network link may be implemented as a wired network link and the second network link may be implemented as a wireless network link.
For example, when the second application is switched to the foreground for playing a program, the controller controls the second application to bypass the protocol stack and perform network communication through a wired network card corresponding to the wired network; meanwhile, the network communication of the first application is controlled to be switched to the wireless network card corresponding to the wireless network for network communication, and the first application bypasses the protocol stack and receives and transmits data through the wireless network card.
For another example, the display device controller controls a third video application played on the foreground user interface to bypass the protocol stack and perform network communication through a wired network card corresponding to the wired network; meanwhile, the controller controls the network communication of the first video application and the second video application running in the background to be switched to the wireless network card corresponding to the wireless network for network communication, and the first video application and the second video application need to bypass the protocol stack and communicate with the wide area network through the wireless network card to receive and send data.
In some embodiments, the first network card and the second network card may also be implemented as different links for accessing the same lan.
For example, a living room is provided with 2 network ports, namely a living room network port 1 and a living room network port 2, which are used for accessing the same network; the first network card is connected to the living room network port 1 through the network cable, and the second network card is connected to the living room network port 2, so that a wired link 1 connected with the first network card and a wired link 2 connected with the second network card are formed.
It can be understood that if the quality of the cable between living room network port 2 and the router is poor, reducing the network bandwidth, while the cable between living room network port 1 and the router is normal, then accessing the network through wired link 1 and through wired link 2 gives the television different network experiences.
In this case, the third video application played and displayed on the foreground user interface of the display device accesses the local area network through the first network card, wired link 1, and living room network port 1, so its available network bandwidth is not affected or reduced by the defective cable connected to living room network port 2; the first and second video applications running in the background access the local area network through the second network card, wired link 2, and living room network port 2, and their available bandwidth is affected and reduced by that defective cable.
Therefore, the above technical solution lets an Android-system television break through the current technical bottleneck and use multiple network accesses at the same time: for example, it can simultaneously access a wired network and a WIFI network, simultaneously access two broadband networks present in the home environment, or simultaneously access the same network from different network ports, thereby avoiding insufficient or reduced network bandwidth caused by multiple applications using the network at once or by the line quality of a particular network port.
FIG. 7A shows a schematic diagram of a user interface for displaying device data according to an embodiment of the present application.
In some embodiments, the display device may include a plurality of network cable interfaces, a wireless network card, and a USB port, each network cable interface corresponding to a corresponding wired ethernet card configured for the display device, so that the display device may access different wired network links.
For example, the display device may be provided with multiple network cable interfaces and multiple USB ports, which may be arranged, for example, at the bottom of the display device body. Of course, the schematic diagram in fig. 7A does not limit the network cable interfaces to a specific position on the display device; the network cable interfaces and the USB ports may also be arranged on the same side, the upper portion, or the back of the display device housing.
It should be noted that when an available wired network is plugged into the first of the multiple network cable interfaces, the first network card corresponding to that interface is connected to the wired network; for ease of understanding and description, the network cable interface corresponding to a network card is simply labeled as that network card in fig. 7A, and this is not repeated below.
The display of the display device may present a user interface showing a live channel of the display device or the content of an installed application; its desktop applications may include, for example, a camera fusion display application, a screen projection application, application A, a weather forecast, and several different video applications. It should be noted that when the user starts a specific desktop application while the first network card is connected to an available network and the USB port is connected to another networking device, the controller of the display device can make that application transmit its communication data over both links at the same time, breaking through the bandwidth and speed limits of conventional single-network transmission.
For example, when the user interface of the display device runs an application a, the controller may control the application to simultaneously transmit and receive data through the network link corresponding to the first network card and the USB link corresponding to the USB port.
It can be understood that if application A transmits data using only the network link corresponding to the first network card, which may be configured as a wired or wireless network card, then when the bandwidth of that link is low, the picture played on the user interface of the VR glasses receiving the display device's data at the opposite end may stutter, because the network transmission speed cannot meet the high-rate transmission requirements of the VR glasses.
When the display device provided by the present application performs the above data transmission and display, the data to be transmitted is sent simultaneously over the 2 links, through the network card and the USB port, so the transmission bandwidth is higher than with a single network. The amount of data the VR glasses receive per unit time likewise increases, which to a certain extent satisfies the high-speed, high-volume data requirements of the picture played on the opposite-end VR glasses and avoids picture stuttering.
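The aggregation amounts to splitting the outgoing stream across the two links and merging at the receiver. The sketch below uses simple round-robin scheduling with sequence numbers so the VR side can restore order; real schedulers would weight the links by measured bandwidth, and the link names are placeholders.

```python
def split_across_links(chunks, links=("network", "usb")):
    """Round-robin outgoing chunks over both links, tagging each with a
    sequence number so the receiver can reorder them."""
    queues = {link: [] for link in links}
    for seq, chunk in enumerate(chunks):
        queues[links[seq % len(links)]].append((seq, chunk))
    return queues

def merge_at_receiver(queues):
    """Opposite end: merge both links' packets back into the original order."""
    packets = [p for q in queues.values() for p in q]
    return [chunk for _, chunk in sorted(packets)]
```

With the load split roughly in half per link, the effective transmission bandwidth approaches the sum of the two links' bandwidths.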
It should be noted that the display device may further be configured with a virtual network card for relaying data: it can receive data from the application program and forward it to the network card and the USB port of the display device for transmission, or receive data arriving from the network card and the USB port and feed it back to the specific application program.
It should be noted that the network links described in the present application may include different local area networks, private line networks, and service networks. It can be understood that the display device provided by the present application can access a wired network link and a USB link simultaneously, or a wireless network link and a USB link simultaneously, so as to carry out network communication over both at once.
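The virtual network card relay described above can be sketched as follows. This is a minimal illustrative model, not the patented implementation: the two physical paths (network card and USB port) are stood in for by plain Python lists, and the fan-out policy is a simple alternation.

```python
# Illustrative sketch of a "virtual network card" sitting between an
# application and two physical links: outgoing writes are fanned out
# across both links, and incoming packets from both links are merged
# back for the application. The links are modeled as plain lists.

class VirtualNic:
    def __init__(self, net_link, usb_link):
        self.net_link = net_link   # stands in for the physical network card
        self.usb_link = usb_link   # stands in for the USB port
        self._toggle = 0

    def write(self, packet):
        """Forward an application packet to one of the two links."""
        target = self.net_link if self._toggle == 0 else self.usb_link
        target.append(packet)
        self._toggle ^= 1          # alternate between the two links

    def read_all(self):
        """Collect packets arriving on both links for the application."""
        merged = self.net_link + self.usb_link
        self.net_link.clear()
        self.usb_link.clear()
        return merged

net, usb = [], []
vnic = VirtualNic(net, usb)
for i in range(4):
    vnic.write(i)
# After four writes, packets 0 and 2 sit on the network link,
# packets 1 and 3 on the USB link.
```

In a real device the two lists would be replaced by a network socket and a USB endpoint, and the alternation by the rate-proportional split described later in this application.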
Fig. 7B shows a schematic diagram of a display device, VR glasses, and a user interface for data transmission according to another embodiment of the present application.
In some embodiments, the devices provided by the present application may be networked with each other, so that transmission between them is accelerated by aggregating the USB link and the network link.
For example, a display device and VR glasses in a home that transmit data to each other may each be configured with a network card for accessing a network and a USB port for transmitting data.
In some embodiments, the display device is configured with a first network card and a first USB port, the VR glasses are configured with a second network card and a second USB port, and the available networks in the environment include types of wired network links and wireless network links.
First, the display device is controlled to access a wired network link through the first network card; then the VR glasses are controlled to access the wired network link through the second network card; and then the first USB port of the display device is connected to the second USB port of the VR glasses through a USB cable. Finally, 2 communication links are established between the display device and the VR glasses: the display device can send data to the peer VR glasses through the wired network link corresponding to the first network card, and can also send data to the peer VR glasses through the USB link corresponding to the first USB port, establishing a local-area-network-oriented transmission acceleration architecture that aggregates the USB link and the network link between the devices.
In some embodiments, the display device controller initiates the data transfer mechanism upon receiving signaling to display the application content running in the user interface on the peer device.
For example, when the user interface of the display device runs application A and the user has not started the function of displaying on the VR glasses, the peer VR glasses do not display a three-dimensional picture of the application content, and the user interface of the display device displays and plays only in two-dimensional mode.
If the user starts the function of displaying on the VR glasses on either the display device side or the peer side, and the display device controller receives the signaling for displaying the video resource on the peer VR glasses, the controller starts the aggregated-link data transmission mechanism to perform high-speed data transmission. After the data is transmitted to the VR glasses, the user interface of the peer VR glasses is as shown in fig. 7B.
First, the controller sends the video resource data to be transmitted from application A to the established virtual network card; application A may also be called the sending-end application. Then, after the video resource data passes through the virtual network card, the controller splits it, for example into a first data set and a second data set. It can be understood that the video resource data needs to be transmitted to the peer VR glasses through 2 different links, and after splitting, the two parts can be transmitted over the different links at the same time.
The controller of the display device controls the split first data set to be transmitted through the first network card over the wired network link to the second network card of the VR glasses, and the second data set to be transmitted through the USB link to the second USB port of the peer VR glasses.
Finally, under the control of the controller of the peer VR glasses, the first data set and the second data set received by the VR glasses are integrated and restored into the original video resource, so as to be displayed and played on the user interface of the VR glasses. The resulting user interface and network architecture are shown in fig. 7B, in which the content of application A is transmitted at high speed to the user interface of the peer VR glasses through the aggregated links.
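The split, dual-link transmit, and reassemble flow of this walkthrough can be sketched as follows. This is a hedged illustration under simplifying assumptions: packets carry explicit sequence numbers, the split simply alternates between the two data sets, and the physical links themselves are omitted.

```python
# Sketch of the sender-side split into a first data set (network link)
# and a second data set (USB link), and the receiver-side integration
# that restores the original packet order by sequence number.

def split_packets(packets):
    """Alternately assign sequence-numbered packets to two data sets."""
    first, second = [], []
    for i, pkt in enumerate(packets):
        (first if i % 2 == 0 else second).append(pkt)
    return first, second

def reassemble(first, second):
    """Merge the two data sets back into sequence-number order."""
    merged = sorted(first + second, key=lambda p: p["seq"])
    return [p["payload"] for p in merged]

packets = [{"seq": i, "payload": f"chunk-{i}"} for i in range(6)]
net_set, usb_set = split_packets(packets)   # would travel on separate links
restored = reassemble(net_set, usb_set)     # done on the VR glasses side
```

A real split would be weighted by the measured link rates rather than alternating, as described in the embodiments below.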
In some embodiments, the display device controller also creates a data processing service, which may also be referred to as a data processing center, in the display device system; it exchanges data with the virtual network card and performs the data splitting or data integration.
For example, when the display device sends data, a data packet to be transmitted to the VR glasses reaches the virtual network card through the network protocol stack. The created data processing service reads the transmission data at the virtual network card node, determines the data distribution ratio between the first data set and the second data set according to the ratio of the transmission capacities of the network link and the USB link, and then, based on the sequence numbers contained in the message format, splits the video resource data packets into a first data set (network packet 1) and a second data set (network packet 2).
That is, the data packets contained in the video resource are divided into network packet 1 and network packet 2 according to the transmission rate of network card 1 and the transmission rate of the USB channel; network packet 1 is sent to network card 1 through a socket (data interface), and network packet 2 is transmitted to the Mini USB port as the data portion of the USB protocol.
In some embodiments, the network packets included in the second data set are encapsulated into USB packets adapted to be transmitted by the USB link during transmission.
First, before the second data set is sent from the first USB port of the display device, the controller packages each network packet, network header together with network data, as a whole into USB data; the encapsulated USB data is then combined with a corresponding USB header to form a USB packet, and the controller controls the USB packet to be transmitted to the VR glasses through the USB link.
It can be understood that when the second data set is transmitted through the USB channel, the display device encapsulates the network header and the network data into the data portion of the USB packet, which is then sent to the USB port of the VR glasses via the USB transmission protocol.
Second, after the VR glasses receive the USB packet through the USB link, the USB data contained in the USB packet is unpacked to restore the network packet corresponding to the second data set, and the network header is then removed from the network packet to obtain the network data corresponding to the video resource.
It can be understood that after the VR glasses receive the USB data, the data portion of the USB data is extracted as network data. In this way, data is transmitted through the USB link and the network link at the same time, realizing link-aggregated transmission.
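The encapsulation and decapsulation steps just described can be sketched as below. The 4-byte length-prefixed "USB header" is an assumption made purely for illustration; real USB framing is defined by the USB protocol stack, and the patent does not specify the header format.

```python
import struct

# Hedged sketch: the whole network packet (network header + network data)
# becomes the data portion of a USB packet, prefixed by an assumed
# 4-byte big-endian length field standing in for the USB header.

def encapsulate_usb(network_packet: bytes) -> bytes:
    """Wrap a network packet as the data portion of a USB packet."""
    usb_header = struct.pack(">I", len(network_packet))  # assumed header
    return usb_header + network_packet

def decapsulate_usb(usb_packet: bytes) -> bytes:
    """Unwrap the USB packet and restore the original network packet."""
    (length,) = struct.unpack(">I", usb_packet[:4])
    return usb_packet[4:4 + length]

net_pkt = b"\x45\x00IP-HDR" + b"video-data"   # mock network header + data
usb_pkt = encapsulate_usb(net_pkt)            # sent over the USB link
restored = decapsulate_usb(usb_pkt)           # done on the VR glasses side
```

The receiving side then strips the network header from `restored` to recover the video payload, as described above.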
Fig. 7C is a schematic diagram illustrating an architecture and a user interface of a display device and VR glasses for data transmission according to another embodiment of the present application.
In some embodiments, the display device to be transmitted with data and the VR glasses can be configured to transmit data through the USB link and the wireless network link at the same time.
For example, the display device is configured with a first wireless network card, and the peer VR glasses are configured with a second wireless network card. The user controls the display device to access a wireless network link through the first wireless network card, and controls the VR glasses to access the same wireless network link through the second wireless network card; the first USB port of the display device is then connected to the second USB port of the VR glasses through a USB cable, so that 2 communicable links are finally established between the display device and the VR glasses.
The display device can send data to the peer VR glasses where the second network card is located through the wireless network link corresponding to the first network card, and can also send data to the USB port of the peer VR glasses through the USB link, thereby establishing dual-link aggregated data transmission between the devices.
It is understood that in some embodiments, for a display device configured with 2 wired network cards and 1 wireless network card, 1 network card may be optionally selected for local area network configuration.
Fig. 7D is a schematic diagram illustrating an architecture and a user interface for data transmission between display devices according to another embodiment of the present application.
In some embodiments, the current display device may further be configured to receive a screen-casting display request from the peer device, which transmits data via the USB link and via the network link corresponding to the first network card; the first network card may be implemented as a wired network card or a wireless network card. When the display device receives data, the data processing service strips the respective headers from the data received at the first network card and the first USB port and writes the data into the virtual network card.
For example, when receiving signaling sent by the peer device for displaying a video resource generated by the peer device, the display device displays, on the user interface of the current display device, the first video application content that the peer device is displaying and playing.
When receiving the screen projection signaling, the controller integrates the data packets that the peer device sends to the first wireless network card and the first USB port of the display device. As described above, the data packets received by the display device at this time may be implemented as the first data set and the second data set; the controller controls the received data to be integrated and sent to the established virtual network card, and the integrated data is then transmitted uniformly to the corresponding video application of the display device, so that the user interface of the display device displays the first application content being played by the peer device.
Specifically, the first data set and the second data set received by the display device arrive at the first wireless network card and the first USB port respectively. The created data processing service then monitors and reads the incoming network packets and removes the IP headers from the data read from the network card; the controller controls the USB protocol stack to remove the USB header from each received USB packet, and the USB service regenerates the network data and sends it to the virtual network card for transfer. The video resource is thus written into the created virtual network card and passed up through the network protocol stack to the target application at the receiving end.
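The receive path just described can be sketched as follows, under stated simplifying assumptions: packets from the network card carry a mock 2-byte "IP header", packets from the USB port carry a mock 2-byte "USB header", and each payload begins with a 1-byte sequence number. None of these sizes come from the patent; they only make the header-stripping and reordering concrete.

```python
# Sketch of the receiving-side data processing service: strip the
# link-specific header from each packet, merge packets from both
# links, reorder by sequence number, and hand the joined payload
# to the virtual network card for the target application.

def strip_header(packet: bytes, header_len: int = 2) -> bytes:
    """Remove the mock 2-byte link header (IP or USB) from a packet."""
    return packet[header_len:]

def receive_merge(net_packets, usb_packets):
    """Merge both links' packets into the original byte stream."""
    stripped = [strip_header(p) for p in net_packets + usb_packets]
    stripped.sort(key=lambda p: p[0])          # order by sequence byte
    return b"".join(p[1:] for p in stripped)   # payloads only

net_rx = [b"IP" + bytes([0]) + b"AA", b"IP" + bytes([2]) + b"CC"]
usb_rx = [b"UH" + bytes([1]) + b"BB", b"UH" + bytes([3]) + b"DD"]
video = receive_merge(net_rx, usb_rx)
```

Here the restored stream interleaves packets that arrived on different links, which is exactly the integration duty assigned to the data processing service above.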
It should be noted that, in some embodiments, when the display device receives and displays the screen-projection content, it may also be configured to play and display it with a general-purpose application, and is not limited to the same application used on the peer device.
It can be understood that when the display device receives data, the process is similar to that of sending data: the application content data generated by the peer device also needs to be split and then transmitted to the current display device simultaneously through the network link and the USB link.
It can be understood that when the network card bandwidth of the display device is low while the bandwidth of the local area network is high, this networking mode between devices can break through the bottleneck of transmission bandwidth; and in scenarios where the transmission bandwidth and speed of the first network link suffer from defects in the quality of the optical or coaxial cable, the networking mode between the devices can likewise break through the bandwidth and speed bottleneck, giving users a different network experience.
Therefore, the technical scheme enables the display device to break through the current technical bottleneck by using multiple transmission links at the same time, thereby avoiding insufficient network bandwidth, bandwidth reduction caused by multiple applications using the network simultaneously, and problems caused by the poor quality of a particular network line.
It can be understood that the multiple network cards and USB ports of the display device can be connected to multiple transmission links, so that the reduction in link bandwidth caused by poor quality of some network lines in a home can be avoided, and the network links used by important applications are kept in a good state.
In some embodiments, taking application A as an example to describe the technical scheme of the present application as a whole: the display device controller may create a virtual network card and modify the routing policy so that all network data is transmitted and relayed through the virtual network card. After the data passes through the virtual network card, it is processed again by the data processing service, which determines the splitting ratio according to the ratio of the USB link speed to the network link speed supported by the network card. For example, if the USB link can reach 500 M/s and the network link can reach 100 M/s, the controller may split every 6 data packets so that 5 are distributed to the USB port and the remaining 1 to the network card.
When the data to be transmitted is sent to the USB port, the USB data processing described above is performed, and the network data to be transmitted is encapsulated into USB data that can be transmitted over the USB link. The controller thus divides the data into network card packets and USB packets: the network card packets are sent to the network card, and the USB packets are sent to the USB interface. The network card data of the display device reaches the network card of the VR glasses over the local area network, while the USB data of the display device reaches the USB port of the VR glasses over the USB link; a data processing service is also established in the system of the VR glasses. After processing the received data, the data processing service of the VR glasses extracts the network card data, cuts off its IP header, and sends it to the virtual network card; meanwhile, the USB service of the VR glasses sends the data with the USB header removed to the virtual network card. The data then passes from the virtual network card through the network protocol stack and reaches the VR glasses application for display.
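The 5-of-6 split from the 500 M/s versus 100 M/s example can be made concrete with the small sketch below. The scheduling rule (integer rate ratio per group of packets) is an illustrative reading of the example, not a prescribed algorithm.

```python
# Numeric sketch of the rate-proportional split described above:
# with an assumed USB rate of 500 M/s and a network rate of 100 M/s,
# five of every six packets go to the USB port and one to the
# network card.

def distribute(packets, usb_rate=500, net_rate=100):
    """Split packets between USB and network sets by rate ratio."""
    usb_share = usb_rate // net_rate      # 5 packets per group to USB
    group = usb_share + 1                 # 6 packets per scheduling group
    usb_set, net_set = [], []
    for i, pkt in enumerate(packets):
        if i % group < usb_share:
            usb_set.append(pkt)
        else:
            net_set.append(pkt)
    return usb_set, net_set

usb_set, net_set = distribute(list(range(12)))
# 12 packets -> 10 to the USB port, 2 to the network card.
```

With other measured rates the same rule yields a different group size, so the split tracks the relative capacities of the two links.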
In some embodiments, based on the data transmission between the VR glasses and the display device in the above embodiments, the technical solution of the present application may also be applied to data transmission and application display between VR glasses and a computer device.
For example, a VR game may run on a smart computer or a computer with strong processing capability, and the display stream of the display device or computer can be transmitted at high speed and in real time to the VR glasses for display through the USB-based dual-link data transmission technology provided by the present application. Meanwhile, the user's operation data at the VR glasses end can also be sent to the display device or computer at high speed, so that the display device or computer performs the corresponding computation and display changes according to the operation data from the VR glasses end, as shown in fig. 7E.
It can be understood that, based on the high-speed transmission scheme provided by the present application, the operation data at the VR glasses end can be transmitted under a low-latency standard to a display device or computer with higher computing capability, and the display data computed there can then be received back at the VR glasses end, realizing resource integration between devices through the USB-based dual-link high-speed networking transmission technology.
It should be noted that, although the transmission scheme provided by the present application is described by taking VR glasses and display devices as examples, the VR glasses may also be implemented as other VR devices or smart devices according to actual situations, and the present application does not specifically limit this.
Based on the above description of the display control scheme for television 3D application content realized by the display device and the VR glasses, together with the related drawings, the present application also provides a display control method for television 3D application content. The specific steps of the method are explained in detail in the technical solutions of the display device and the VR device provided above, and are not repeated here.
This embodiment has the advantages that different linkage schemes between the VR device and the display device can be realized by determining the foreground application type; further, 3D display of the television 3D application content on the VR device can be realized by acquiring the foreground application background image and the user image; and further, by constructing dynamic interaction between the VR device and the display device, the VR device can be used to operate the foreground application, realizing synchronized display on the display device and the VR device.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the foregoing discussion in some embodiments is not intended to be exhaustive or to limit the implementations to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (11)

1. A display device, comprising:
a display for displaying a user interface;
the image acquisition interface is used for acquiring a user image;
a first controller configured to:
determining an application type of foreground application in the user interface when the VR device is detected to be connected to the display device;
when the application type is a first display format type, acquiring an application background image displayed in the user interface by the foreground application and the user image acquired by the image acquisition interface;
and sending the application background image and the user image to the VR equipment, and rendering the application background image and the user image in the VR equipment to enable the VR equipment to display a 3D picture.
2. The display device of claim 1, wherein the first controller is further configured to:
receiving control data sent by the VR device, wherein the control data comprises sensor data generated by a user operating in a user interface of the VR device and/or sensor data generated by a user operating an external device connected with the VR device, and the external device is connected to the VR device in a wireless or wired mode;
and controlling the display content of the foreground application in the user interface of the display equipment based on the control data.
3. The display device of claim 1, wherein the first controller is further configured to:
and when the application type is a second display format type, recording a display image applied in the user interface by the foreground, encoding the display image and then sending the encoded display image to the VR equipment, wherein the encoded display image is decoded in the VR equipment to play a 2D picture.
4. The display device of claim 3, wherein the first controller is further configured to, in the first controller recording the display image of the foreground application in the user interface:
and acquiring the display image in a frame group layer unit of a display equipment system, and synthesizing an application user interface and a physical channel image of the display equipment into the display image in the frame group layer unit.
5. The display device of claim 1, wherein the first controller is further configured to, in the first controller receiving the control data sent by the VR device:
receiving control data sent by the VR device that includes current sensor data and predicted sensor data, the predicted sensor data being sensor data predicted by the VR device, based on a prediction algorithm, for the period during which the current sensor data is transmitted.
6. A VR device, comprising:
a second controller configured to:
when the VR device is connected with a display device whose foreground application is of a first display format type, receiving an application background image of the user interface foreground application and a user image sent by the display device, wherein the user image is acquired by an image acquisition interface of the display device;
and controlling a VR device user interface to display a 3D picture generated by rendering according to the application background image and the user image.
7. The VR device of claim 6, wherein the second controller is further configured to:
sending control data including current sensor data and predicted sensor data to the display device, the predicted sensor data being sensor data predicted by the VR device, based on a prediction algorithm, for the period during which the second controller transmits the current sensor data to the display device.
8. The VR device of claim 7, wherein the second controller is further configured to, in sending control data to the display device:
the control data includes sensor data generated by a user operating in the VR device user interface and/or sensor data generated by a user operating a peripheral connected to the VR device, the peripheral being connected to the VR device wirelessly or by wire.
9. The VR device of claim 6, wherein the second controller is further configured to:
when the foreground application is of a second display format type, receiving a display image which is sent by the display equipment and is coded, wherein the display image is a screen recording image of a user interface of the display equipment;
and decoding the display image, and controlling the VR equipment user interface to play a corresponding 2D picture.
10. A display control method of virtual reality application content, the method comprising:
determining the application type of foreground application when the connection of VR equipment is detected;
when the application type is a first display format type, acquiring an application background image displayed by the foreground application and a user image acquired by an image acquisition interface;
and sending the application background image and the user image to the VR equipment, and rendering the application background image and the user image in the VR equipment so that the VR equipment displays a 3D picture.
11. A display control method of virtual reality application content, the method comprising:
when connected with a display device whose foreground application is of a first display format type, receiving an application background image of the user interface foreground application and a user image sent by the display device, wherein the user image is acquired by an image acquisition interface of the display device;
and displaying a 3D picture generated by rendering according to the application background image and the user image.
CN202210190102.9A 2021-09-30 2022-02-28 Display device, VR device and display control method of virtual reality application content Pending CN114630101A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111164454 2021-09-30
CN2021111644549 2021-09-30

Publications (1)

Publication Number Publication Date
CN114630101A true CN114630101A (en) 2022-06-14

Family

ID=81899183

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210190102.9A Pending CN114630101A (en) 2021-09-30 2022-02-28 Display device, VR device and display control method of virtual reality application content

Country Status (1)

Country Link
CN (1) CN114630101A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106599770A (en) * 2016-10-20 2017-04-26 江苏清投视讯科技有限公司 Skiing scene display method based on body feeling motion identification and image matting
CN107491934A (en) * 2017-08-04 2017-12-19 国网山东省电力公司 A kind of 3D interview exam systems based on virtual reality
US20180001198A1 (en) * 2016-06-30 2018-01-04 Sony Interactive Entertainment America Llc Using HMD Camera Touch Button to Render Images of a User Captured During Game Play
US20180308288A1 (en) * 2017-04-20 2018-10-25 Samsung Electronics, Co. Ltd. System and method for two dimensional application usage in three dimensional virtual reality environment
CN110505471A (en) * 2019-07-29 2019-11-26 青岛小鸟看看科技有限公司 One kind wearing display equipment and its screen capture method, apparatus
CN111669662A (en) * 2020-07-03 2020-09-15 海信视像科技股份有限公司 Display device, video call method and server



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination