CN114286137A - Mirror image screen projection method, display device and terminal - Google Patents


Publication number
CN114286137A
Authority
CN
China
Prior art keywords
display
terminal
screen
video stream
stream data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110969847.0A
Other languages
Chinese (zh)
Inventor
刘美玉
武支友
肖成创
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202110969847.0A priority Critical patent/CN114286137A/en
Priority to PCT/CN2022/084106 priority patent/WO2022242328A1/en
Priority to CN202280026627.7A priority patent/CN117157987A/en
Publication of CN114286137A publication Critical patent/CN114286137A/en
Priority to US18/510,339 priority patent/US20240089526A1/en

Landscapes

  • Controls And Circuits For Display Device (AREA)

Abstract

Some embodiments of the present application utilize the UPnP protocol so that terminals running different operating systems can use the same screen projection technology to complete screen projection operations. The method comprises the following steps: the display device starts up, enables the DMR function, creates the device service capability, and joins a multicast group of the UPnP protocol, so that a terminal determines an online device through the multicast group, controls a display interface to display a screen mirroring control of the online device, and, upon receiving an instruction selecting the screen mirroring control corresponding to the online device, sends video stream data obtained by recording the screen in real time; the display device then receives the video stream data, and decodes and plays it.

Description

Mirror image screen projection method, display device and terminal
Technical Field
The present application relates to the technical field of mirror image screen projection, and in particular to a mirror image screen projection method, a display device, and a terminal.
Background
A display device, as a large-screen device, can provide users with a better viewing experience. In the related art, a user can project content from a mobile phone onto a display device, so that the phone's content is watched on the display device with a clearer picture. However, screen projection technologies depend heavily on the operating system: terminals running different systems must use different technologies to mirror their screens to the display device, and multiple systems cannot complete mirror screen projection with the same technology.
Disclosure of Invention
Some embodiments of the present application provide a mirror image screen projection method, a display device, and a terminal, which enable terminals of different systems to use the same screen projection technology to complete screen projection operations.
In a first aspect, there is provided a display device comprising:
a display for displaying a user interface;
a user interface for receiving an input signal;
a controller respectively coupled to the display and the user interface for performing:
starting up the display device, enabling the DMR function, creating the device service capability, and joining a multicast group of the UPnP protocol, so that a terminal determines an online device through the multicast group, controls a display interface to display a screen mirroring control of the online device, and, upon receiving an instruction selecting the screen mirroring control corresponding to the online device, sends video stream data obtained by recording the screen in real time; the display device then receives the video stream data, and decodes and plays it.
In some embodiments, the controller is further configured to perform: and after joining the multicast group, sending an online message.
In some embodiments, the controller is further configured to perform: receiving search information sent by a terminal to the multicast group;
determining, according to the search information, whether the display device has the DMR capability;
and if the display device has the DMR capability, sending a search success message to the terminal.
In some embodiments, the controller is further configured to perform:
the method comprises the steps of receiving a push message sent by a terminal, analyzing the push message to obtain a cache address, and sending a video acquisition request to the terminal, wherein the video acquisition request comprises the cache address.
A second aspect provides a terminal for performing:
the terminal starts up, enables the DMC and DMS functions, joins a multicast group of the UPnP protocol, determines an online device through the multicast group, and controls a display interface to display a screen mirroring control of the online device; and upon receiving an instruction selecting the screen mirroring control, sends video stream data obtained by recording the screen in real time.
In some embodiments, the terminal is configured to determine an online device through the multicast group according to the following steps:
receiving an online message from the display device, and parsing the device service capability of the display device;
and if the display device has the DMR capability, determining that the display device is an online device.
In some embodiments, the terminal is configured to determine an online device through the multicast group according to the following steps:
receiving an operation of searching for the display device, sending search information to the multicast group, receiving a search success message sent by the display device according to the search information, and determining that the display device is an online device.
In some embodiments, the terminal is configured to receive an instruction selecting the screen mirroring control and send video stream data obtained by real-time screen recording according to the following steps:
receiving an instruction selecting the screen mirroring control, and sending a push message to the display device, wherein the push message includes a cache address;
receiving a video acquisition request, starting to record the screen to obtain video stream data, and storing the video stream data in a cache corresponding to the cache address;
and reading the video stream data from the cache and sending it to the display device.
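The cache handoff in the steps above can be sketched with an in-memory buffer keyed by cache address: the screen recorder appends encoded chunks, and the sender drains them toward the display device. The class and names below are illustrative assumptions; the application does not specify the cache implementation:

```python
from collections import deque

class StreamCache:
    """Minimal sketch of the terminal-side cache keyed by cache address."""
    def __init__(self, max_chunks: int = 256):
        self._buffers: dict[str, deque] = {}
        self._max = max_chunks  # bound the buffer so a slow reader drops old frames

    def store(self, cache_address: str, chunk: bytes) -> None:
        """Called by the screen recorder: store one encoded video chunk."""
        buf = self._buffers.setdefault(cache_address, deque(maxlen=self._max))
        buf.append(chunk)

    def drain(self, cache_address: str) -> list[bytes]:
        """Called by the sender: read and remove all buffered chunks."""
        buf = self._buffers.get(cache_address, deque())
        chunks = list(buf)
        buf.clear()
        return chunks

cache = StreamCache()
cache.store("/cache/stream0", b"frame-0")
cache.store("/cache/stream0", b"frame-1")
pending = cache.drain("/cache/stream0")  # chunks to send to the display device
```

A real implementation would feed `store()` from the screen-recording encoder and loop `drain()` in the network sender; the bounded `deque` reflects the design choice that live mirroring prefers dropping stale frames over unbounded buffering.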
In a third aspect, a mirror image projection method is provided, which is applied to a display device, and includes:
starting up the display device, enabling the DMR function, creating the device service capability, and joining a multicast group of the UPnP protocol, so that a terminal determines an online device through the multicast group, controls a display interface to display a screen mirroring control of the online device, and, upon receiving an instruction selecting the screen mirroring control corresponding to the online device, sends video stream data obtained by recording the screen in real time; the display device then receives the video stream data, and decodes and plays it.
In a fourth aspect, a mirror image screen projection method is provided, which is applied to a terminal and includes:
the terminal starts up, enables the DMC and DMS functions, joins a multicast group of the UPnP protocol, determines an online device through the multicast group, and controls a display interface to display a screen mirroring control of the online device; and upon receiving an instruction selecting the screen mirroring control, sends video stream data obtained by recording the screen in real time.
In summary, some embodiments of the present application utilize the UPnP protocol so that terminals running different operating systems can use the same screen projection technology to complete screen projection operations. The method comprises the following steps: the display device starts up, enables the DMR function, creates the device service capability, and joins a multicast group of the UPnP protocol, so that a terminal determines an online device through the multicast group, controls a display interface to display a screen mirroring control of the online device, and, upon receiving an instruction selecting the screen mirroring control corresponding to the online device, sends video stream data obtained by recording the screen in real time; the display device then receives the video stream data, and decodes and plays it.
Drawings
FIG. 1 illustrates a usage scenario of a display device according to some embodiments;
fig. 2 illustrates a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
FIG. 3 is a flow diagram illustrating a method of playing data according to some embodiments;
FIG. 4 illustrates a user interface diagram according to some embodiments;
FIG. 5 illustrates a flow diagram of a mirror image screen projection method according to some embodiments;
FIG. 6 illustrates a flowchart of a mirror image screen projection method according to still further embodiments.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller that communicates with the display device through infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods, and controls the display device 200 in a wireless or wired manner. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc., to control the display device 200.
In some embodiments, the smart device 300 (e.g., mobile terminal, tablet, computer, laptop, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the display device 200 may also be controlled in manners other than through the control apparatus 100 and the smart device 300. For example, a user's voice command may be received directly by a module configured inside the display device 200, or by a voice control device provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various content and interactions to the display device 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments the controller comprises a processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component for driving image display; it receives image signals output from the controller and displays video content, image content, a menu manipulation interface, and a user manipulation UI interface.
In some embodiments, the display 260 may be a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the external control apparatus 100 or the server 400 through the communicator 220.
In some embodiments, the user interface may be configured to receive control signals from the control apparatus 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the tuner demodulator 210 receives broadcast television signals in a wired or wireless manner, and demodulates audio/video signals, as well as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in different separate devices; that is, the tuner demodulator 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other actionable control. The operations related to the selected object are: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon.
In some embodiments, the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), a first to an nth interface for input/output, a communication bus (Bus), and the like.
The CPU processor is used for executing operating system and application program instructions stored in the memory, and for executing various application programs, data, and content according to various interactive instructions received from external input, so as to finally display and play various audio and video content. The CPU processor may include a plurality of processors, e.g., a main processor and one or more sub-processors.
In some embodiments, a graphics processor for generating various graphics objects, such as: icons, operation menus, user input instruction display graphics, and the like. The graphic processor comprises an arithmetic unit, which performs operation by receiving various interactive instructions input by a user and displays various objects according to display attributes; the system also comprises a renderer for rendering various objects obtained based on the arithmetic unit, wherein the rendered objects are used for being displayed on a display.
In some embodiments, the video processor is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be directly displayed or played on the display device 200.
In some embodiments, the video processor includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module is used for demultiplexing the input audio and video data stream. And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like. And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display. And the frame rate conversion module is used for converting the frame rate of the input video. And the display formatting module is used for converting the received video output signal after the frame rate conversion, and changing the signal to be in accordance with the signal of the display format, such as an output RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played in the speaker.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on display 260, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In some embodiments, a system of a display device may include a Kernel (Kernel), a command parser (shell), a file system, and an application program. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, a scheduler, signals and interprocess communication (IPC) are operated and maintained. And after the kernel is started, loading the Shell and the user application program. The application program is compiled into machine code after being started, and a process is formed.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (referred to as an "Application layer"), an Application Framework (Application Framework) layer (referred to as a "Framework layer"), an Android runtime (Android runtime) layer and a system library layer (referred to as a "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application program runs in the application program layer, and the application programs may be windows (windows) programs carried by an operating system, system setting programs, clock programs or the like; or an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications. The application framework layer includes a number of predefined functions and acts as a processing center that decides the actions of the applications in the application layer. Through the API interface, an application can access system resources and obtain system services during execution.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications as well as general navigational fallback functions, such as controlling exit, opening, fallback, etc. of the applications. The window manager is used for managing all window programs, such as obtaining the size of a display screen, judging whether a status bar exists, locking the screen, intercepting the screen, controlling the change of the display window (for example, reducing the display window, displaying a shake, displaying a distortion deformation, and the like), and the like.
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer; when the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a WiFi driver, a USB driver, an HDMI driver, sensor drivers (such as a fingerprint sensor, a temperature sensor, a pressure sensor, etc.), a power driver, and the like.
A display device, as a large-screen device, can provide users with a better viewing experience. In the related art, a user can project content from a mobile phone onto a display device, so that the phone's content is watched on the display device with a clearer picture. However, screen projection technologies depend heavily on the operating system: terminals running different systems must use different technologies to mirror their screens to the display device, and multiple systems cannot complete mirror screen projection with the same technology. Illustratively, a mobile phone running Android completes mirror screen projection using Miracast, while a mobile phone running iOS completes it using AirPlay.
Before describing in detail the mirror image screen projection method provided by the embodiments of the present application, the UPnP (Universal Plug and Play) protocol is first introduced. The technical basis of DLNA (Digital Living Network Alliance) is the UPnP protocol, whose workflow is briefly described as follows. Step 1: addressing, in which the device obtains a network address. Step 2: discovery, in which a control point (one type of UPnP device) looks for devices on the entire network while devices also announce their own presence; the discovery process yields information about the devices, such as the device type, the device's globally unique identifier, and the address of the device description. Step 3: description, in which the detailed information in the description file is obtained; in this step the detailed capabilities of the device become known. Step 4: control, such as sending push information and responding to push information. Step 5: eventing, which is similar to the observer pattern: any state change of the device notifies the observers. Step 6: presentation, which displays the information and state of the device as a supplement to the control and eventing steps.
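The discovery step (Step 2) is realized in UPnP by SSDP messages over UDP multicast. As a sketch, a control point's M-SEARCH request and the parsing of a device's reply headers look roughly as follows; the MediaRenderer search target is the standard DMR device type and an assumption about what this application's terminal would search for:

```python
SSDP_ADDR = ("239.255.255.250", 1900)  # standard UPnP multicast group and port

def build_msearch(search_target: str, mx: int = 3) -> bytes:
    """Build an SSDP M-SEARCH request for the given device type."""
    return (
        "M-SEARCH * HTTP/1.1\r\n"
        f"HOST: {SSDP_ADDR[0]}:{SSDP_ADDR[1]}\r\n"
        'MAN: "ssdp:discover"\r\n'
        f"MX: {mx}\r\n"
        f"ST: {search_target}\r\n\r\n"
    ).encode()

def parse_ssdp_headers(datagram: bytes) -> dict:
    """Parse the header lines of an SSDP datagram into a dict."""
    headers = {}
    for line in datagram.decode(errors="replace").split("\r\n")[1:]:
        if ":" in line:
            key, _, value = line.partition(":")
            headers[key.strip().upper()] = value.strip()
    return headers

request = build_msearch("urn:schemas-upnp-org:device:MediaRenderer:1")
```

A control point would send `request` to the multicast group over UDP and run `parse_ssdp_headers` on each unicast reply to collect the device type, unique identifier, and description address mentioned above.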
In this process, different devices play different roles with different functions. UPnP devices take the roles of DMR (Digital Media Renderer), DMS (Digital Media Server), and DMC (Digital Media Control point). A DMR device supports media playback and control; a DMS device provides media acquisition, recording, storage, and output; and a DMC device controls other UPnP devices. Under these role definitions and the workflow above, the UPnP protocol can implement a media push function: after the DMC discovers a DMR device, media resources on the DMC side are pushed to the DMR side. It should be noted that these media resources are not the content of mirror screen projection; the DMR obtains the media file stream from the DMS side for playback, the DMC can control playback on the DMR side, and the playback state on the DMR side can be returned to the DMC side.
For example, a television side serves as a DMR, a mobile phone side serves as a DMC and a DMS, and the mobile phone first discovers the television and then pushes media resources in the mobile phone to the television side for display.
The following describes in detail a mirror image screen projection method according to an embodiment of the present application. The method implements mirror screen projection with real-time video stream transmission based on the UPnP protocol, and does not limit the system type of the terminal that uses it. As shown in fig. 5, the method includes:
s100, starting the terminal, starting the functions of the DMC and the DMS, and adding a multicast group of the UPNP protocol. S200, determining online equipment through the multicast group. S300, controlling a display interface to display a screen mirror image control of the online equipment. In the embodiment of the application, the terminal plays roles of DMC and DMS, and starts the functions of DMC and DMS.
The display device starts up, enables the DMR function, creates the device service capability, and joins the multicast group of the UPnP protocol. In the embodiment of the application, the display device plays the role of DMR and enables the DMR function. Creating the device service capability refers to creating a DMR service capability.
Illustratively, the multicast group address of the UPnP protocol may be 239.255.255.250. In the embodiment of the application, the terminal can select the device to which the screen is to be mirrored, namely an online device.
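Joining the multicast group 239.255.255.250 is done with a standard IGMP membership request on a UDP socket. A minimal sketch (port 1900 is the standard SSDP port; binding to all interfaces via INADDR_ANY is an assumption about the deployment):

```python
import socket
import struct

SSDP_GROUP = "239.255.255.250"
SSDP_PORT = 1900

def make_membership_request(group: str = SSDP_GROUP) -> bytes:
    """Pack the ip_mreq structure used to join a multicast group
    on all local interfaces (INADDR_ANY)."""
    return struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))

def open_ssdp_socket() -> socket.socket:
    """Open a UDP socket bound to the SSDP port and join the group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", SSDP_PORT))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    make_membership_request())
    return sock
```

Both the display device (DMR) and the terminal (DMC/DMS) would hold such a socket to receive the group's NOTIFY and M-SEARCH traffic.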
In the embodiment of the application, the online equipment is determined in an automatic or manual mode.
First, automatic determination of the online device is described.
In some embodiments, the display device sends an online message after joining the multicast group.
The terminal receives the online message of the display device and parses the device service capability of the display device;
if the display device has DMR capability, the terminal determines that the display device is an online device.
In some embodiments, the online message includes the local network address of the display device, and the terminal identifies the online device by that local network address.
In the embodiment of the application, a display device can mirror-project with the terminal only if it has DMR capability. In this embodiment, a display device whose online message is received by the terminal is treated as an online device available for mirror screen projection with the terminal.
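As an illustrative sketch (not taken from the patent itself), the automatic-mode check on the terminal side can be expressed in Python: the terminal parses an SSDP NOTIFY "alive" message received from the multicast group and treats the sender as an online device when it announces the standard MediaRenderer (DMR) device type. The sample message and its header values below are hypothetical.

```python
# Sketch: deciding whether an SSDP NOTIFY "alive" message sent on the
# UPNP multicast group announces an online DMR device.
def parse_notify(message: str) -> dict:
    """Parse the headers of an SSDP NOTIFY message into a dict."""
    lines = message.split("\r\n")
    headers = {}
    for line in lines[1:]:           # skip the "NOTIFY * HTTP/1.1" start line
        if ":" in line:
            key, _, value = line.partition(":")
            headers[key.strip().upper()] = value.strip()
    return headers

def is_online_dmr(message: str) -> bool:
    """A sender counts as an online DMR when it announces itself alive
    with the standard MediaRenderer device type in its NT header."""
    h = parse_notify(message)
    return (h.get("NTS") == "ssdp:alive"
            and "MediaRenderer" in h.get("NT", ""))

# Hypothetical online message from a display device:
notify = ("NOTIFY * HTTP/1.1\r\n"
          "HOST: 239.255.255.250:1900\r\n"
          "NT: urn:schemas-upnp-org:device:MediaRenderer:1\r\n"
          "NTS: ssdp:alive\r\n"
          "LOCATION: http://192.168.1.125:8081/desc.xml\r\n\r\n")
```

The LOCATION header corresponds to the local network address carried in the online message, from which the terminal identifies the online device.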
The manual determination of the online device is described below.
In some embodiments, the terminal receives an operation of searching for a display device and sends search information to the multicast group.
The display device receives the search information sent to the multicast group by the terminal, determines from it whether the display device has DMR capability, and, if so, sends a search-success message to the terminal. The terminal receives the search-success message and determines that the display device is an online device.
For example, the user may enter the name of the display device in a search box, at which point the device service capability of the display device is resolved. In this embodiment, a display device determined to have DMR capability is treated as an online device available for mirror screen projection with the terminal.
In some embodiments, the search information includes the local network address of the terminal, and the display device sends the search-success message to the terminal according to that address.
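The manual search can be sketched as the terminal multicasting an SSDP M-SEARCH datagram to 239.255.255.250:1900, so that only devices matching the search target (here the standard MediaRenderer device type, i.e. DMR capability) reply with a search-success response. This is a hedged sketch, not the patent's implementation; the MX value and ST string are assumptions based on the common UPNP search convention.

```python
# Sketch: building the SSDP M-SEARCH datagram a terminal could multicast
# when the user manually searches for a display device with DMR
# capability. ST selects the device type that must answer; MX is the
# maximum random response delay in seconds.
def build_msearch(mx: int = 3) -> bytes:
    lines = [
        "M-SEARCH * HTTP/1.1",
        "HOST: 239.255.255.250:1900",
        'MAN: "ssdp:discover"',
        f"MX: {mx}",
        "ST: urn:schemas-upnp-org:device:MediaRenderer:1",
        "", "",                      # terminating blank line (CRLF CRLF)
    ]
    return "\r\n".join(lines).encode("utf-8")
```

A device that matches the ST target would answer with a unicast response (the search-success message), which the terminal uses to mark it as an online device.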
The two modes above determine the online device automatically and manually, respectively.
In some embodiments, after the online device is determined, the terminal controls the display interface to display the screen mirroring control of the online device. S400: the terminal receives an instruction selecting the screen mirroring control and sends video stream data obtained by real-time screen recording to the display device; the display device receives, decodes, and plays the video stream data.
In some embodiments, as shown in Fig. 6, the step of receiving an instruction selecting the screen mirroring control and sending video stream data obtained by real-time screen recording includes:
S401: the terminal receives an instruction selecting the screen mirroring control and sends a push message, which includes a cache address, to the display device. Illustratively, the screen mirroring control may be displayed on the terminal's user interface, and the user may touch the position on the screen corresponding to the control to generate the selection instruction.
In the embodiment of the application, the cache corresponding to the cache address is created when the instruction selecting the screen mirroring control is received, and the push message sent to the display device carries that cache address.
The display device receives the push message from the terminal, parses it to obtain the cache address, and sends the terminal a video acquisition request that includes the cache address.
In some embodiments, the display device may decide according to actual conditions whether to send the video acquisition request; if it is not in a condition to mirror-project with the terminal, it may refrain from sending the request.
S402: the terminal receives the video acquisition request, starts screen recording to obtain video stream data, and stores the data in the cache corresponding to the cache address. In the embodiment of the application, the video stream data obtained by screen recording is the content displayed by the terminal in real time.
S403: the terminal reads the video stream data from the cache and sends it to the display device. It should be noted that as long as video stream data exists in the cache, it is continuously sent to the display device.
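A minimal sketch of the S403 send loop, assuming the cache is modeled as an in-memory queue and the network send is a callback (both are hypothetical stand-ins for the patent's cache address and HTTP connection):

```python
# Sketch: the terminal-side send loop of step S403. Screen-recorded
# packets land in the cache; as long as data remains there, or recording
# is still in progress, packets are drained and handed to a send
# callback. (A real implementation would block waiting for the recorder
# to produce data instead of spinning on an empty cache.)
from collections import deque

def drain_cache(cache: deque, send, recording_stopped) -> int:
    """Send while the cache holds data or recording continues.
    Returns the number of packets sent."""
    sent = 0
    while cache or not recording_stopped():
        if cache:
            send(cache.popleft())
            sent += 1
    return sent
```

When the first instruction to stop mirror projection arrives, `recording_stopped` flips to true and the loop exits once the cache is empty, matching the stop behavior described below.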
In some embodiments, the terminal receives a first instruction to stop mirror projection, stops screen recording, and stops reading video stream data from the cache and sending it to the display device.
In some embodiments, the display device receives an instruction to stop mirror projection and stops decoding and playing the video stream data.
In some embodiments, the display device receives an instruction to pause decoding and playing the video stream data, and pauses accordingly.
In some embodiments, the display device receives an instruction to resume decoding and playing the video stream data, and continues decoding and playing it. In the embodiment of the present application, the resume instruction is received after decoding and playback have been paused.
To ensure that a real-time video stream can be transmitted, the embodiment of the application improves the data formats of both the video acquisition request the display device sends to the terminal and the video stream data the terminal sends back after receiving the request, i.e. the request-data and reply-data formats.
Under the UPNP protocol, the video stream is transmitted over HTTP: the display device sends a video acquisition request, and the terminal receives the request and replies with the corresponding video stream. The specific interaction formats are as follows:
(1) Format for transmitting a local file over the UPNP protocol:
Request data format:
GET /XXXX.mp4 HTTP/1.1
Host: XXXXXX
Connection: close
Range: bytes=0-
User-Agent: hsp/1.0 (Linux; Android 6.0)
Reply data format:
HTTP/1.1 206 Partial Content
Content-Type: video/mp4
Content-Length: 3234061
Accept-Ranges: bytes
Content-Range: bytes 0-3234060/3234061
transferMode.dlna.org: Streaming
contentFeatures.dlna.org: DLNA.ORG_PN=AVC_MP4_BL_CIF15_AAC_520;DLNA.ORG_OP=01;DLNA.ORG_CI=1;DLNA.ORG_FLAGS=01500000000000000000000000000000
Connection: close
Date: Tue, 09 Mar 2021 05:42:26 GMT
(2) Format for transmitting a real-time video stream over the UPNP protocol:
Request data format:
GET /cacdbd8e6fed4e2a06a5a32a3ced76021988f44f HTTP/1.1
Host: 192.168.1.125:8081
Connection: close
Range: bytes=0-
User-Agent: hsp/1.0 (Linux; Android 6.0)
Reply data format:
HTTP/1.1 200 OK
Content-Type: video/vnd.dlna.mpeg-tts
transferMode.dlna.org: Streaming
contentFeatures.dlna.org: DLNA.ORG_CI=1;DLNA.ORG_FLAGS=01300000000000000000000000000000
Connection: Keep-Alive
Date: Tue, 09 Mar 2021 07:54:23 GMT
Cache-Control: no-store, no-cache, must-revalidate
Transfer-Encoding: chunked
When the UPNP protocol transmits a real-time video stream in this format, the following points should be noted:
the buffer address is generated to have uniqueness, so that the video stream is marked by a unique ID generated by MD5 encryption, when the local file MD5 is generated, a file path is generally used as a seed, but the real-time video stream has no uniform method, a time stamp of the video stream can be acquired as the seed to generate an MD5 value, the time stamp is unique, the generated MD5 value is unique, and the uniqueness of the buffer address of the acquired video stream is ensured.
Since the video stream has no known length, the Content-Length, Accept-Ranges, and Content-Range fields should no longer appear in the response.
The Content-Type field is filled according to the format of the transmitted video stream; for example, a TS-encapsulated video stream can be declared as video/vnd.dlna.mpeg-tts.
DLNA.ORG_FLAGS=01300000000000000000000000000000: the DLNA specification defines each character as a hexadecimal digit; the first 8 digits carry the valid flag bits and the remaining 24 digits are reserved. Converting the leading digits 013 to binary gives 0000 0001 0011. Bit 24 is the tm-s (Streaming Mode) flag; if DLNA version 1.5 is supported (bit 20 is the dlna-v1.5 flag), this value must be set to 1, i.e. streaming mode is supported, and once the terminal has data it must transmit it to the display device fast enough.
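Assuming the hexadecimal reading described above, the flag word can be decoded as follows (field names follow common DLNA usage; tm_i and tm_b are neighboring transport-mode flags included for illustration):

```python
# Sketch: decoding the 8 valid hex digits of DLNA.ORG_FLAGS. Bit 24 is
# the tm-s (Streaming Mode) flag and bit 20 the dlna-v1.5 flag, as
# described above; the trailing 24 hex digits are reserved.
def dlna_flags(value: str) -> dict:
    primary = int(value[:8], 16)     # first 8 hex digits = 32 flag bits
    return {
        "tm_s":      bool(primary & (1 << 24)),  # streaming transport mode
        "tm_i":      bool(primary & (1 << 23)),  # interactive transport mode
        "tm_b":      bool(primary & (1 << 22)),  # background transport mode
        "dlna_v1_5": bool(primary & (1 << 20)),  # DLNA v1.5 support
    }
```

Applied to the reply above, 013… sets tm-s and dlna-v1.5, i.e. a v1.5 device streaming in real time; the local-file reply's 015… sets tm-b instead of leaving it clear, reflecting that a stored file can also be fetched in background mode.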
The Connection field should be Keep-Alive, to reduce the time spent on TCP connection setup for each HTTP request and to allow the display device and the terminal to keep transmitting content over the same connection.
Cache-Control: no-store, no-cache, must-revalidate: requires that the client revalidate with the server on every request.
Transfer-Encoding: chunked: HTTP is specified as a persistent connection, but given the characteristics of a real-time video stream neither its length nor a range (the unit defining a range request) can be calculated in advance, so chunked data transmission must be specified. Once Transfer-Encoding: chunked is declared, each content entity must subsequently be packed into chunks for transmission.
In summary, some embodiments of the present application use the UPNP protocol so that terminals running different systems can complete screen projection with the same technology. The method comprises: the display device starts up, enables the DMR function, creates its device service capability, and joins a multicast group of the UPNP protocol, so that the terminal determines the online device through the multicast group, controls the display interface to display a screen mirroring control of the online device, and sends video stream data obtained by real-time screen recording when it receives an instruction selecting the screen mirroring control corresponding to the online device; the display device receives, decodes, and plays the video stream data.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display for displaying a user interface;
a user interface for receiving an input signal;
a controller respectively coupled to the display and the user interface for performing:
starting up a display device, starting up a DMR function, establishing device service capability, adding a multicast group of a UPNP protocol so that a terminal determines an online device according to the multicast group, controlling a display interface to display a screen mirror image control of the online device, and sending video stream data obtained by recording a screen in real time when the terminal receives an instruction of selecting the screen mirror image control corresponding to the online device; and the display equipment receives the video stream data and decodes and plays the video stream data.
2. The display device according to claim 1, wherein the controller is further configured to perform:
and after joining the multicast group, sending an online message.
3. The display device according to claim 1, wherein the controller is further configured to perform: receiving search information sent to the multicast group by a terminal;
determining whether the display equipment has the DMR capability or not according to the search information;
and if the display equipment has the DMR capability, sending a search success message to the terminal.
4. The display device according to claim 1, wherein the controller is further configured to perform:
the method comprises the steps of receiving a push message sent by a terminal, analyzing the push message to obtain a cache address, and sending a video acquisition request to the terminal, wherein the video acquisition request comprises the cache address.
5. A terminal, characterized in that it is configured to perform:
starting a terminal, starting functions of a DMC and a DMS, adding a multicast group of a UPNP protocol, determining an online device through the multicast group, and controlling a display interface to display a screen mirror image control of the online device; and receiving an instruction of selecting the screen mirror image control, and sending video stream data obtained by recording the screen in real time.
6. The terminal of claim 5, wherein the terminal is configured to determine an online device through the multicast group according to the following steps:
receiving an online message of the display equipment, and analyzing the equipment service capability of the display equipment;
and if the display equipment has the DMR capability, determining that the display equipment is online equipment.
7. The terminal of claim 5, wherein the terminal is configured to determine an online device through the multicast group according to the following steps:
receiving an operation of searching for a display device, sending search information to the multicast group, receiving a search-success message sent by the display device according to the search information, and determining that the display device is an online device.
8. The terminal according to claim 5, wherein the terminal is configured to execute the following steps of receiving an instruction of selecting the screen mirroring control, and sending video stream data obtained by real-time screen recording:
receiving an instruction of selecting a screen mirror image control, and sending a push message to display equipment, wherein the push message comprises a cache address;
receiving a video acquisition request, starting to record a screen to obtain video stream data, and storing the video stream data in a cache corresponding to a cache address;
and reading the video stream data in the buffer and sending the video stream data to the display equipment.
9. A mirror image screen projection method is applied to display equipment and comprises the following steps:
starting up a display device, starting up a DMR function, establishing device service capability, adding a multicast group of a UPNP protocol so that a terminal determines an online device according to the multicast group, controlling a display interface to display a screen mirror image control of the online device, and sending video stream data obtained by recording a screen in real time when the terminal receives an instruction of selecting the screen mirror image control corresponding to the online device; and the display equipment receives the video stream data and decodes and plays the video stream data.
10. A mirror image screen projection method is applied to a terminal and comprises the following steps:
starting a terminal, starting functions of a DMC and a DMS, adding a multicast group of a UPNP protocol, determining an online device through the multicast group, and controlling a display interface to display a screen mirror image control of the online device; and receiving an instruction of selecting the screen mirror image control, and sending video stream data obtained by recording the screen in real time.
CN202110969847.0A 2021-05-17 2021-08-23 Mirror image screen projection method, display device and terminal Pending CN114286137A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202110969847.0A CN114286137A (en) 2021-08-23 2021-08-23 Mirror image screen projection method, display device and terminal
PCT/CN2022/084106 WO2022242328A1 (en) 2021-05-17 2022-03-30 Method for playback in split screen and display device
CN202280026627.7A CN117157987A (en) 2021-05-17 2022-03-30 Split-screen playing method and display device
US18/510,339 US20240089526A1 (en) 2021-05-17 2023-11-15 Method for playback in split screen and display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110969847.0A CN114286137A (en) 2021-08-23 2021-08-23 Mirror image screen projection method, display device and terminal

Publications (1)

Publication Number Publication Date
CN114286137A true CN114286137A (en) 2022-04-05

Family

ID=80868432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110969847.0A Pending CN114286137A (en) 2021-05-17 2021-08-23 Mirror image screen projection method, display device and terminal

Country Status (1)

Country Link
CN (1) CN114286137A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI826203B (en) * 2022-12-15 2023-12-11 技嘉科技股份有限公司 Computer device and display device
WO2024012344A1 (en) * 2022-07-14 2024-01-18 华为技术有限公司 Screen mirroring method and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105791351A (en) * 2014-12-23 2016-07-20 深圳Tcl数字技术有限公司 Method and system for realizing screen pushing based on DLNA technology
CN107896339A (en) * 2017-10-30 2018-04-10 努比亚技术有限公司 A kind of video broadcasting method, terminal and computer-readable recording medium
CN111427527A (en) * 2020-03-20 2020-07-17 海信视像科技股份有限公司 Screen projection method, device, equipment and computer readable storage medium
CN112099750A (en) * 2020-09-24 2020-12-18 Oppo广东移动通信有限公司 Screen sharing method, terminal, computer storage medium and system
CN112616065A (en) * 2020-12-16 2021-04-06 深圳乐播科技有限公司 Screen image initiating method and device, computer equipment, readable storage medium and screen image presenting system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
景东 (Jing Dong): 《Web应用开发技术》 (Web Application Development Technology), China Railway Publishing House, page 136 *


Similar Documents

Publication Publication Date Title
CN114302219B (en) Display equipment and variable frame rate display method
CN112367543B (en) Display device, mobile terminal, screen projection method and screen projection system
CN112612443B (en) Audio playing method, display device and server
CN113507638B (en) Display equipment and screen projection method
CN113407142A (en) Display device and screen projection method
CN113064645B (en) Startup interface control method and display device
CN112672195A (en) Remote controller key setting method and display equipment
CN114339332B (en) Mobile terminal, display device and cross-network screen projection method
CN114286165A (en) Display device, mobile terminal and screen projection data transmission method
CN112486934B (en) File synchronization method and display device
CN112911380B (en) Display device and connection method with Bluetooth device
WO2022105409A1 (en) Fault diagnosis method, terminal device, and display device
WO2022048203A1 (en) Display method and display device for manipulation prompt information of input method control
CN112632160A (en) Intelligent device and intelligent device login method
CN114286137A (en) Mirror image screen projection method, display device and terminal
CN113111214A (en) Display method and display equipment for playing records
CN114915810B (en) Media resource pushing method and intelligent terminal
CN115022688A (en) Display device and media data relay method
CN114302199A (en) Display apparatus and data sharing method
CN112653608A (en) Display device, mobile terminal and cross-network data transmission method
CN112929724B (en) Display device, set top box and far-field pickup awakening control method
CN112752152B (en) Delivery video playing method and display equipment
CN114915818B (en) Media resource pushing method and intelligent terminal
CN112231088B (en) Browser process optimization method and display device
CN114302378A (en) Bluetooth mode switching method of display device, display device and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220405