CN116360724A - Processing method and electronic equipment - Google Patents

Processing method and electronic equipment

Info

Publication number
CN116360724A
Authority
CN
China
Prior art keywords
data
screen
connection
equipment
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310345018.4A
Other languages
Chinese (zh)
Inventor
王培超
万建武
汪兵
郭纯纯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202310345018.4A priority Critical patent/CN116360724A/en
Publication of CN116360724A publication Critical patent/CN116360724A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N 21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N 21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N 21/43076 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a processing method and an electronic device. The method, applied to a first device, comprises: outputting an electronic information code, wherein the electronic information code comprises at least first information of the first device, so that a second device that obtains the electronic information code establishes a first connection with the first device at least according to the first information, the first connection being a point-to-point connection; obtaining screen projection data of the second device at least according to the first connection; and outputting the screen projection data.

Description

Processing method and electronic equipment
Technical Field
The application belongs to the technical field of computer applications, and particularly relates to a processing method and an electronic device.
Background
Currently, the capability of mutual collaboration between devices based on ChromeOS (an operating system) is relatively limited. Such limited collaboration usually takes place between a mobile terminal device and a mail system or the Android Messages application supported by Chrome, and when a screen projection service between devices is provided for a user, the screen data are transmitted over the internet, which has the drawback of high implementation complexity.
Disclosure of Invention
For this reason, the application discloses a processing method and an electronic device, as follows:
a processing method applied to a first device, the method comprising:
outputting an electronic information code, wherein the electronic information code comprises at least first information of the first device, so that a second device that obtains the electronic information code establishes a first connection with the first device at least according to the first information, the first connection being a point-to-point connection;
obtaining screen projection data of the second device at least according to the first connection;
and outputting the screen projection data.
In the above method, preferably, the first connection is used for instructing the second device to create a socket transmission channel with the first device according to the first information;
wherein obtaining the screen projection data of the second device at least according to the first connection comprises:
receiving the screen projection data transmitted by the second device through the socket transmission channel.
In the above method, preferably, outputting the screen projection data includes:
obtaining, according to the screen projection data, a network abstraction layer data frame corresponding to the screen projection data;
parsing the network abstraction layer data frame to obtain a plurality of screen recording image frames;
decoding the screen recording image frames through a target component to obtain decoded data;
converting the decoded data according to a target image format to obtain a target image frame;
and outputting the target image frame.
In the above method, preferably, after parsing the network abstraction layer data frame to obtain the plurality of screen recording image frames and before decoding the screen recording image frames through the target component to obtain the decoded data, the method further includes:
adding the screen recording image frames to pre-configured frame queues, wherein the frame queues are first-in first-out queues, there are a plurality of frame queues, the number of frame queues is determined according to the data type of the screen projection data, and the data type is related to the data volume of the screen projection data;
wherein decoding the screen recording image frames through the target component to obtain the decoded data comprises:
reading the screen recording image frames from each frame queue respectively;
and calling the target component so that the target component decodes the screen recording image frames to obtain the decoded data.
In the above method, preferably, after outputting the screen projection data, the method further includes:
acquiring, through a pre-configured monitoring component, a first input operation directed at the screen projection data;
and in response to the first input operation, executing a first instruction, the first instruction causing the second device to execute a second instruction.
A processing method applied to a second device, the method comprising:
scanning an electronic information code output by a first device to obtain first information of the first device contained in the electronic information code;
establishing a first connection with the first device at least according to the first information, the first connection being a point-to-point connection;
obtaining screen projection data of the second device;
and transmitting the screen projection data to the first device at least according to the first connection, so that the first device outputs the screen projection data.
In the above method, preferably, after establishing the first connection with the first device at least according to the first information, the method further includes:
creating, based on the first connection, a socket transmission channel with the first device according to the first information;
wherein transmitting the screen projection data to the first device at least according to the first connection comprises:
transmitting the screen projection data to the first device through the socket transmission channel.
In the above method, preferably, obtaining the screen projection data of the second device comprises:
capturing, through a target component, the output picture of the second device to obtain multiple screen recording data frames, wherein the target component is capable of image encoding and image decoding;
constructing network abstraction layer data frames according to the screen recording data frames;
and obtaining the screen projection data according to the network abstraction layer data frames.
An electronic device as a first device, comprising:
a display for outputting an electronic information code, the electronic information code comprising at least first information of the first device, so that a second device that obtains the electronic information code establishes a first connection with the first device at least according to the first information, the first connection being a point-to-point connection;
a communication module for obtaining screen projection data of the second device at least according to the first connection;
and a processor for outputting the screen projection data through the display.
An electronic device as a second device, comprising:
a scanner for scanning an electronic information code output by the first device to obtain first information of the first device contained in the electronic information code;
a communication module for establishing a first connection with the first device at least according to the first information, the first connection being a point-to-point connection;
and a processor for obtaining screen projection data of the second device and transmitting, through the communication module, the screen projection data to the first device at least according to the first connection, so that the first device outputs the screen projection data.
As can be seen from the above solution, in the processing method and the electronic device provided by the present application, an electronic information code including first information of a first device is output on the first device, so that a second device that obtains the electronic information code can establish a first connection with the first device in a point-to-point manner at least according to the first information. On this basis, the first device can obtain the screen projection data of the second device according to the first connection and output the screen projection data, thereby implementing a screen projection service from the second device to the first device. Thus, in the present application, the point-to-point connection is established through the electronic information code, and the screen projection service can be implemented without a network handshake process or transmission of the screen projection data over the internet, which reduces the implementation complexity of the screen projection service.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained from the provided drawings by a person of ordinary skill in the art without inventive effort.
FIG. 1 is a flowchart of a processing method according to a first embodiment of the present disclosure;
FIG. 2 is a diagram of an example of devices between which screen projection is desired;
fig. 3, fig. 4 and fig. 5 are exemplary diagrams of implementing screen projection between a mobile phone and a television in the present application;
FIGS. 6 and 7 are partial flow charts of a processing method according to a first embodiment of the present disclosure;
FIG. 8 is another flow chart of a processing method according to the first embodiment of the present application;
fig. 9 is a flowchart of a processing method according to a second embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to a third embodiment of the present application;
fig. 11 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present application;
fig. 12 is a flowchart of implementing a P2P connection through code scanning between the Host-end and Client-end devices in the present application;
FIG. 13 is a flowchart of mirror-image screen projection from the Client to the Host in the present application;
fig. 14 is a schematic diagram of the frame structure of a NALU data frame;
fig. 15 is a schematic diagram of UI video data in a standard format;
fig. 16 is a flowchart of implementing bidirectional control in a mirror-image screen projection scenario in the present application.
Detailed Description
The embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. It is evident that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments obtained by one of ordinary skill in the art without inventive effort based on the present disclosure fall within the scope of the present disclosure.
Referring to fig. 1, which is a flowchart of an implementation of a processing method according to a first embodiment of the present application, the method may be applied to an electronic device capable of data processing, such as the first device shown in fig. 2. The first device may be a television or a large screen, and the second device is a device that needs to project a screen to the first device, such as a mobile phone. The technical solution in this embodiment is mainly used to reduce the complexity of implementing a screen projection service between devices.
Specifically, the method in this embodiment may include the following steps:
step 101: and outputting an electronic information code, wherein the electronic information code at least comprises first information of the first device, so that a second device for obtaining the electronic information code establishes a first connection with the first device at least according to the first information.
The electronic information code may be a two-dimensional code or a bar code. The first information includes a media access control address Mac (MediaAccessControlAddress), a basic service set identifier BSSID (BasicServiceSetIdentifier), a service set identifier SSID (ServiceSet Identifier), a Password PWD (Password), and the like.
While the first connection is a point-to-point type of connection, such as a P2P connection based on a WiFi module or a Bluetooth BT (Bluetooth) module.
For example, as shown in fig. 3, a two-dimensional code is output on a screen of a first device, such as a television, where the two-dimensional code includes information, such as a Mac address, a BSSID, an SSID, and a PWD, of the television, and a second device, such as a mobile phone, may obtain the information in the two-dimensional code through a scanner, such as a camera, and then the mobile phone establishes a P2P connection with the television according to the information.
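As an illustration of how the first information might be carried in the two-dimensional code, the sketch below packs the MAC address, BSSID, SSID and PWD into a JSON payload on the first device and parses it back on the second device. The field names, the FirstInfo type and both helper functions are assumptions for illustration; the patent does not specify the payload format.

```kotlin
import org.json.JSONObject

// Hypothetical container for the "first information" carried in the QR code.
data class FirstInfo(val mac: String, val bssid: String, val ssid: String, val pwd: String)

// First device (Host): serialize the first information into the QR payload (field names are assumed).
fun buildQrPayload(info: FirstInfo): String = JSONObject()
    .put("mac", info.mac)
    .put("bssid", info.bssid)
    .put("ssid", info.ssid)
    .put("pwd", info.pwd)
    .toString()

// Second device (Client): recover the first information after the scanner decodes the QR text.
fun parseQrPayload(payload: String): FirstInfo {
    val json = JSONObject(payload)
    return FirstInfo(
        mac = json.getString("mac"),
        bssid = json.getString("bssid"),
        ssid = json.getString("ssid"),
        pwd = json.getString("pwd")
    )
}
```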
Step 102: obtaining the screen projection data of the second device at least according to the first connection.
For example, as shown in fig. 4, the mobile phone transmits the screen projection data to the television over the P2P connection.
Step 103: outputting the screen projection data.
For example, as shown in fig. 4, the screen projection data of the mobile phone is output on the television, so that the screen projection service from the mobile phone to the television is realized.
As can be seen from the foregoing, in the processing method provided in the first embodiment of the present application, an electronic information code including first information of a first device is output on the first device, so that a second device that obtains the electronic information code can establish a first connection with the first device in a point-to-point manner at least according to the first information. On this basis, the first device can obtain the screen projection data of the second device according to the first connection and output the screen projection data, thereby implementing a screen projection service from the second device to the first device. Thus, in this embodiment, the point-to-point connection is established through the electronic information code, and the screen projection service can be implemented without a network handshake process or transmission of the screen projection data over the internet, which reduces the implementation complexity of the screen projection service.
In one implementation, in step 102, the screen projection data transmitted by the second device may be received through a socket transmission channel created based on the first connection.
Specifically, the first connection is used for instructing the second device to create a socket transmission channel with the first device according to the first information. For example, as shown in fig. 5, after the P2P connection between the mobile phone and the television is established, the mobile phone creates a socket transmission channel based on the P2P connection according to the first information; the mobile phone then transmits the screen projection data to the television through the socket transmission channel, so that the screen projection data is output on the television.
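A minimal sketch of the socket transmission channel built on top of the P2P link, assuming the two-dimensional code also carries the host's P2P entry IP and that both sides agree on a fixed port; the port number and helper names are illustrative, not specified by the patent.

```kotlin
import java.net.ServerSocket
import java.net.Socket

const val CAST_PORT = 5_555  // assumed port; the patent does not specify one

// First device (Host): listen for the screen projection channel after the P2P link is up.
fun acceptCastChannel(): Socket = ServerSocket(CAST_PORT).accept()

// Second device (Client): connect to the P2P entry IP obtained from the two-dimensional code.
fun openCastChannel(hostP2pIp: String): Socket = Socket(hostP2pIp, CAST_PORT)
```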
In one implementation, when outputting the screen projection data, step 103 may be implemented through the following steps, as shown in fig. 6:
Step 601: obtaining, according to the screen projection data, a network abstraction layer data frame corresponding to the screen projection data.
The network abstraction layer data frame may be a network abstraction layer unit (NALU) data frame conforming to the H.264 protocol.
For example, in this embodiment, the screen projection data is first decompressed to obtain decompressed data packets, and the headers of the data packets, such as the payload headers of the decompressed data packets, are then parsed to obtain the NALU data frames in the screen projection data.
Step 602: parsing the network abstraction layer data frames to obtain a plurality of screen recording image frames.
In this embodiment, instantaneous decoder refresh (IDR) frames may be constructed from the NALU data frames to obtain a plurality of screen recording image frames, such as I frames, P frames, and B frames.
Step 603: decoding the screen recording image frames through a target component to obtain decoded data.
The target component is a software component, such as a MediaCodec component. On this basis, in this embodiment, the target component may be called so that the target component decodes the screen recording image frames, thereby obtaining the decoded data.
It should be noted that the decoded data may be data in the YUV color coding format (the color components used in the SECAM and PAL color spaces).
Step 604: converting the decoded data according to a target image format to obtain a target image frame.
The target image format may be, among other formats, the RGB (red green blue) color format. For example, in this embodiment, the decoded data in YUV format is converted into a target image frame in RGB format.
Step 605: outputting the target image frame.
Specifically, in this embodiment, the target image frame may be pushed to the display screen of the first device through Web Real-Time Communication (WebRTC), so as to output the target image frame.
Therefore, in this embodiment, the decoding of the screen projection data is implemented by a software component; by decoupling the decoding from the hardware configuration of the first device, the situation in which decoding cannot be performed due to hardware incompatibility is avoided.
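The following sketch illustrates steps 601–605 with Android's standard MediaCodec API: an H.264 decoder is created and NALU frames are fed in, with the decoded output returned as raw (typically YUV) bytes. Buffer management is simplified, the YUV-to-RGB conversion and the WebRTC push are omitted, and the helper names are illustrative rather than taken from the patent.

```kotlin
import android.media.MediaCodec
import android.media.MediaFormat

// Create an H.264 (AVC) decoder; a null surface so decoded frames come back as byte buffers (e.g. YUV).
fun createH264Decoder(width: Int, height: Int): MediaCodec {
    val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height)
    return MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC).apply {
        configure(format, /* surface = */ null, /* crypto = */ null, /* flags = */ 0)
        start()
    }
}

// Feed one NALU frame (Annex-B, 0x00000001 start code) and drain a decoded frame if one is ready.
fun decodeOneFrame(decoder: MediaCodec, nalu: ByteArray, ptsUs: Long): ByteArray? {
    val inIndex = decoder.dequeueInputBuffer(10_000)
    if (inIndex >= 0) {
        decoder.getInputBuffer(inIndex)?.apply { clear(); put(nalu) }
        decoder.queueInputBuffer(inIndex, 0, nalu.size, ptsUs, 0)
    }
    val info = MediaCodec.BufferInfo()
    val outIndex = decoder.dequeueOutputBuffer(info, 10_000)
    if (outIndex < 0) return null  // no decoded frame available yet (or format change)
    val decoded = ByteArray(info.size)
    decoder.getOutputBuffer(outIndex)?.get(decoded)
    decoder.releaseOutputBuffer(outIndex, /* render = */ false)
    return decoded  // raw decoded data, typically in a YUV color format
}
```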
Further, after step 602 and before step 603, the method in this embodiment may further include the following step, as shown in fig. 7:
Step 606: adding the screen recording image frames to pre-configured frame queues.
The frame queues are first-in first-out queues; there are a plurality of frame queues, the number of frame queues is determined according to the data type of the screen projection data, and the data type is related to the data volume of the screen projection data.
Specifically, the data type of the screen projection data may be a type classified according to the data volume, such as a video playback type, a game type, or a still image type; or a high-data-volume type, a low-data-volume type, and the like.
Accordingly, in step 603, decoding the screen recording image frames through the target component to obtain the decoded data may be implemented in the following manner:
First, the screen recording image frames are read from each frame queue respectively; then the target component is called so that the target component decodes the screen recording image frames to obtain the decoded data.
Therefore, in the present application, parallel decoding is realized by arranging a plurality of frame queues, which improves the decoding rate, achieves low delay, and improves the frame rate.
In addition, in this embodiment, the number of frame queues can be configured according to the data volume of the screen projection data, so that the decoding rate is guaranteed while resource waste is avoided, achieving a balance between performance and resources.
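A minimal sketch of such a frame queue, assuming one bounded blocking queue per decoding lane; the capacity of 10 frames and the drop-when-full behaviour follow the detailed flow described later (section 2.4), while the mapping from data type to queue count is an assumption for illustration only.

```kotlin
import java.util.concurrent.ArrayBlockingQueue

// One bounded FIFO queue per decoding lane; a capacity of 10 frames follows the detailed flow below.
class FrameQueue(capacity: Int = 10) {
    private val queue = ArrayBlockingQueue<ByteArray>(capacity)

    // Enqueue a screen recording image frame; if the queue is full the frame is dropped (offer returns false).
    fun offerFrame(frame: ByteArray): Boolean = queue.offer(frame)

    // Block until a frame is available for the decoder to read.
    fun takeFrame(): ByteArray = queue.take()
}

// The number of queues is derived from the data type of the screen projection data
// (this particular mapping is an assumption for illustration).
fun queueCountFor(dataType: String): Int = when (dataType) {
    "video", "game" -> 4   // high data volume: more parallel decoding lanes
    else -> 1              // still images / low data volume
}
```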
In one implementation, after the screen projection data is output in step 103, the method in this embodiment may further include the following steps, as shown in fig. 8:
Step 104: acquiring, through a pre-configured monitoring component, a first input operation directed at the screen projection data.
The monitoring component may be a SocketServer monitoring component that monitors input operations on the first device.
Specifically, the first input operation is an operation, such as a click on the screen projection data, performed by a user of the first device.
Step 105: in response to the first input operation, executing a first instruction, the first instruction causing the second device to execute a second instruction.
Specifically, the first instruction is used for transmitting the first input operation to the second device, so that the second device generates the second instruction according to the first input operation and executes it. The second instruction realizes, for the screen projection data, the function corresponding to the first input operation; correspondingly, the screen projection data on the second device is synchronously transmitted to the first device for display. In this way, an operation on the screen projection data on the first device causes the second device to perform the corresponding operation function, thereby realizing bidirectional control between the first device and the second device.
Alternatively, the first instruction is a functional instruction generated from the first input operation and executed on the first device for the screen projection data, while the second instruction is executed on the second device for the screen projection data and realizes the function corresponding to the first input operation.
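A minimal sketch of forwarding the first input operation from the first device to the second device over the existing socket channel; the InputEvent fields and the binary wire format are assumptions for illustration, since the patent does not specify how the operation is serialized.

```kotlin
import java.io.DataOutputStream
import java.net.Socket

// Hypothetical wire format for the first input operation (event type plus normalized coordinates).
data class InputEvent(val type: Int, val x: Float, val y: Float)

// First device: forward the monitored input operation to the second device over the socket channel.
// A real implementation would keep one DataOutputStream per channel instead of wrapping it per call.
fun forwardInputEvent(channel: Socket, event: InputEvent) {
    DataOutputStream(channel.getOutputStream()).apply {
        writeInt(event.type)
        writeFloat(event.x)
        writeFloat(event.y)
        flush()
    }
}
```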
Referring to fig. 9, which is a flowchart of an implementation of a processing method according to a second embodiment of the present application, the method may be applied to an electronic device capable of data processing, such as the second device shown in fig. 2. The first device may be a television or a large screen, and the second device is a device that needs to project a screen to the first device, such as a mobile phone. The technical solution in this embodiment is mainly used to reduce the complexity of implementing a screen projection service between devices.
Specifically, the method in this embodiment may include the following steps:
step 901: and scanning the electronic information code output by the first device to obtain the first information of the first device contained in the electronic information code.
Step 902: establishing a first connection with the first device based at least on the first information; the first connection is a point-to-point type of connection.
For example, as shown in fig. 3, a two-dimensional code is output on a screen of a first device, such as a television, where the two-dimensional code includes information, such as a Mac address, a BSSID, an SSID, and a PWD, of the television, and a second device, such as a mobile phone, may obtain the information in the two-dimensional code through a scanner, such as a camera, and then the mobile phone establishes a P2P connection with the television according to the information.
Step 903: and obtaining the screen projection data of the second equipment.
Specifically, in this embodiment, the screen-projection data of the second device may be obtained by:
firstly, capturing images of an output picture of second equipment through a target component to obtain multi-frame screen recording data frames; the target component is capable of image encoding and image decoding; for example, screen recording is performed on a mobile phone screen through a MediaCodec component to obtain frame data in a YUV format, then a data frame is constructed to obtain multi-frame screen recording data frames, and MediaCodec can also be used for image coding and image decoding, that is to say, screen capturing is realized through a software component for realizing image coding and image decoding in the embodiment;
then, constructing a network abstraction layer data frame, namely a NALU data frame according to the screen recording data frame; for example, in this embodiment, according to a preset encoding and decoding protocol, such as H264 protocol, a NALU data frame is constructed by using a screen recording data frame;
and finally, according to the network abstract layer data frame, obtaining the screen throwing data. For example, in this embodiment, according to the TCP/IP protocol of the first connection, a header may be encapsulated for the NALU data frame, for example, a payload may be added as the header to obtain a target data packet, and then the target data packet is compressed to obtain the screen-throwing data.
Step 904: transmitting the screen projection data to the first device at least according to the first connection, so that the first device outputs the screen projection data.
Specifically, in this embodiment, after the first connection is established, the second device may create, based on the first connection, a socket transmission channel with the first device according to the first information;
based on this, in step 904, the screen-casting data may be transmitted to the first device through the socket transmission channel.
After the socket transmission channel is created, the second device may also send connection confirmation information to the first device, where the connection confirmation information characterizes whether there is a historical connection between the first device and the second device; based on the above, the second device may output a screen throwing request control for a user of the second device according to the first information in the case that no history connection exists, and after receiving a second input operation for the screen throwing request control, may transmit screen throwing data to the first device according to the second input operation; and in the case of a historical connection, the second device may transmit the screen-cast data directly to the first device.
As can be seen from the above, in the processing method provided in the second embodiment of the present application, an electronic information code including first information of a first device is output on the first device, so that a second device that obtains the electronic information code can establish a first connection with the first device in a point-to-point manner at least according to the first information. On this basis, the first device can obtain the screen projection data of the second device according to the first connection and output the screen projection data, thereby implementing a screen projection service from the second device to the first device. Thus, in this embodiment, the point-to-point connection is established through the electronic information code, and the screen projection service can be implemented without a network handshake process or transmission of the screen projection data over the internet, which reduces the implementation complexity of the screen projection service.
Referring to fig. 10, which is a schematic structural diagram of an electronic device according to a third embodiment of the present application, the electronic device serves as a first device and includes the following structures:
a display 1001 for outputting an electronic information code, the electronic information code including at least first information of the first device, so that a second device that obtains the electronic information code establishes a first connection with the first device at least according to the first information, the first connection being a point-to-point connection;
a communication module 1002, such as a WiFi module, configured to obtain, according to at least the first connection, screen-projection data of the second device;
and a processor 1003 for outputting the screen projection data through the display 1001.
As can be seen from the foregoing, in the electronic device provided in the third embodiment of the present application, an electronic information code including first information of the first device is output on the first device, so that a second device that obtains the electronic information code can establish a first connection with the first device in a point-to-point manner at least according to the first information. On this basis, the first device can obtain the screen projection data of the second device according to the first connection and output the screen projection data, thereby implementing a screen projection service from the second device to the first device. Thus, in this embodiment, the point-to-point connection is established through the electronic information code, and the screen projection service can be implemented without a network handshake process or transmission of the screen projection data over the internet, which reduces the implementation complexity of the screen projection service.
In one implementation, the first connection is configured to instruct the second device to create a socket transmission channel with the first device according to the first information;
the communication module 1002 is specifically configured to: receive the screen projection data transmitted by the second device through the socket transmission channel.
In one implementation, the processor 1003 is specifically configured to: obtain, according to the screen projection data, a network abstraction layer data frame corresponding to the screen projection data; parse the network abstraction layer data frame to obtain a plurality of screen recording image frames; decode the screen recording image frames through a target component to obtain decoded data; convert the decoded data according to a target image format to obtain a target image frame; and output the target image frame through the display 1001.
In one implementation, the processor 1003 is further configured to: after parsing the network abstraction layer data frame to obtain the plurality of screen recording image frames and before decoding the screen recording image frames through the target component to obtain the decoded data, add the screen recording image frames to pre-configured frame queues, where the frame queues are first-in first-out queues, there are a plurality of frame queues, the number of frame queues is determined according to the data type of the screen projection data, and the data type is related to the data volume of the screen projection data. When decoding the screen recording image frames through the target component to obtain the decoded data, the processor 1003 is specifically configured to: read the screen recording image frames from each frame queue respectively; and call the target component so that the target component decodes the screen recording image frames to obtain the decoded data.
In one implementation, after the display 1001 outputs the screen projection data, the processor 1003 is further configured to: acquire, through a pre-configured monitoring component, a first input operation directed at the screen projection data; and in response to the first input operation, execute a first instruction that causes the second device to execute a second instruction.
It should be noted that, the specific implementation of each component in this embodiment may refer to the corresponding content in the foregoing, which is not described in detail herein.
Referring to fig. 11, which is a schematic structural diagram of an electronic device according to a fourth embodiment of the present application, the electronic device serves as a second device and may include the following structures:
a scanner 1101, such as a camera, for scanning an electronic information code output by a first device, so as to obtain first information of the first device included in the electronic information code;
a communication module 1102, such as a WiFi module, configured to establish a first connection with the first device according to at least the first information; the first connection is a point-to-point type connection;
the processor 1103 is configured to obtain the screen-casting data of the second device, and transmit, through the communication module 1102, the screen-casting data to the first device at least according to the first connection, so that the first device outputs the screen-casting data.
As can be seen from the foregoing, in the electronic device provided in the fourth embodiment of the present application, an electronic information code including first information of the first device is output on the first device, so that a second device that obtains the electronic information code can establish a first connection with the first device in a point-to-point manner at least according to the first information. On this basis, the first device can obtain the screen projection data of the second device according to the first connection and output the screen projection data, thereby implementing a screen projection service from the second device to the first device. Thus, in this embodiment, the point-to-point connection is established through the electronic information code, and the screen projection service can be implemented without a network handshake process or transmission of the screen projection data over the internet, which reduces the implementation complexity of the screen projection service.
In one implementation, after establishing the first connection with the first device at least according to the first information, the communication module 1102 is further configured to: create, based on the first connection, a socket transmission channel with the first device according to the first information.
When transmitting the screen projection data to the first device at least according to the first connection, the communication module 1102 is specifically configured to: transmit the screen projection data to the first device through the socket transmission channel.
In one implementation, the processor 1103 is specifically configured to: capture, through a target component, the output picture of the second device to obtain multiple screen recording data frames, where the target component is capable of image encoding and image decoding; construct network abstraction layer data frames according to the screen recording data frames; and obtain the screen projection data according to the network abstraction layer data frames.
It should be noted that, the specific implementation of each component in this embodiment may refer to the corresponding content in the foregoing, which is not described in detail herein.
Taking devices configured with the Chrome operating system as an example, the capability of mutual collaboration between ChromeOS-based devices is currently limited. The main scenario is that a mail system or the Android Messages application supported by Chrome performs limited collaboration with a mobile terminal device, and there is no corresponding solution for the customer requirement of screen projection between devices.
For this reason, the technical solution of the present application can realize stable and smooth two-way mirror-image screen projection between Chrome devices without third-party devices. For example, the two parties may project the screen from A to B at a certain moment as required, and later switch to projecting from B to A as required; the switching of the projecting-side/receiving-side roles can be performed through a web plug-in or a local function of the device. The solution supports an immersive full-screen projection experience, bidirectional file dragging and transfer, and other interconnection and collaboration functions, without occupying system resources or incurring the authorization cost of an additional third-party APP; network delay is low, the technical solution can be updated online in real time, and innovative and collaborative schemes such as vertical services can be added on top of it.
The specific solution is as follows:
first, as shown in fig. 12, a flow chart of implementing P2P connection by scanning codes between devices at the Host end and the Client end is shown. The specific flow is as follows:
1. the Host end and the Client end (being a Chrome operating system or other types of operating systems) respectively enable a BT module or a WIFI module in the enabling equipment;
2. the Host end displays the two-dimensional code and opens a SocketServer monitoring;
3. the Client scans the two-dimensional code, and HOST end entry is provided in the two-dimensional code, for example: MAC address, BSSID, SSID, PWD, etc.;
4. the Client tries to establish P2P (PeertoPeer) connection;
5. P2P connection is successful;
6. the two-dimensional code provides a P2P entry IP, and the Client creates a socket;
7. the Client end is connected with the HOST end;
8. the Host end shows that the code scanning is successful;
9. the Client sends ACK (Acknowledge) an acknowledgement to HOST. Such as: flag, product number PID (ProductID) inside the manufacturer, manufacturer number VID (VenderID), and the like. And confirms whether it is a new device, i.e. whether to memorize the Host end;
10. under the condition that the Client terminal is a new device of the Host terminal, the Client terminal pops up a confirmation box: accepting (sending message to Host: connectstatus: 1) or rejecting (sending message to Host: connectstatus: 0), and realizing screen-throwing data transmission under the condition of user acceptance.
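For an Android client, step 4 could look like the following sketch using the platform's standard Wi-Fi P2P API, with the MAC address taken from the scanned two-dimensional code. Permission handling and the broadcast receiver for connection results are omitted, and a ChromeOS host would go through its own network stack instead, so this is only an illustration of the client side.

```kotlin
import android.net.wifi.p2p.WifiP2pConfig
import android.net.wifi.p2p.WifiP2pManager

// Client side: try to establish the P2P connection using the HOST entry scanned from the QR code.
// `manager` and `channel` are assumed to come from WifiP2pManager.initialize(); permissions omitted.
fun connectToHost(manager: WifiP2pManager, channel: WifiP2pManager.Channel, hostMac: String) {
    val config = WifiP2pConfig().apply { deviceAddress = hostMac }
    manager.connect(channel, config, object : WifiP2pManager.ActionListener {
        override fun onSuccess() { /* step 5: the P2P connection request was accepted */ }
        override fun onFailure(reason: Int) { /* retry or surface the failure to the user */ }
    })
}
```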
Fig. 13 shows a flowchart of mirror-image screen projection from the Client to the Host. The specific flow is as follows:
1. Client end:
1.1, start screen recording, i.e., project the entire UI of the Client end as a mirror image to the Host end to achieve overall collaboration and bidirectional file interaction, without account or third-party App restrictions and without going through the World Wide Web, thereby realizing low-delay projection; this differs from the video-stream-transmission approach of other technical solutions.
1.2, continuously record screen frame data through a MediaCodec component;
1.3, splice the frame data obtained by screen recording with the prefix 0x00000001 (Flag) to form H.264 NALU data frames (see the sketch after this list). The frame structure of a NALU data frame is shown in fig. 14:
1.3.1, construct NALU units, where one NALU consists of a NALU header (NAL header) corresponding to the video coding and one raw byte sequence payload (RBSP), delimited by the four-byte start code (0x00000001).
As shown in fig. 15, UI video data in a standard format includes a sequence parameter set (SPS) and a picture parameter set (PPS) in addition to I frames, P frames, and B frames, where each NALU has the frame structure shown in fig. 14. In this embodiment, the value of the GOP (Group of Pictures), a parameter affecting coding quality, may be increased, i.e., the reference period may be increased, where the reference period refers to the distance between two P frames; this ensures that the number of B frames between two P frames is sufficiently large and guarantees the image quality.
Here, an I frame (IFrame) is an intra-coded frame, a P frame (PFrame) is a forward predictive frame, and a B frame (BFrame) is a bi-directional interpolated prediction frame.
1.4, the spliced NALU data plus the channel protocol data form TCP (Transmission Control Protocol) payload data for the Host end, which is packed and then transmitted to the Host end.
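A sketch of steps 1.2–1.4 on the encoder side: encoded frames are drained from a MediaCodec H.264 encoder and the 0x00000001 prefix is spliced on to form NALU frames for the TCP payload, exactly as the flow above describes (some encoders already emit Annex-B start codes, in which case the splice is redundant). Encoder configuration and the screen-capture surface are omitted, and the helper name is illustrative.

```kotlin
import android.media.MediaCodec

private val START_CODE = byteArrayOf(0x00, 0x00, 0x00, 0x01)  // NALU prefix (Flag)

// Drain one encoded frame from the H.264 encoder and prepend the start code to form a NALU frame.
fun drainOneNalu(encoder: MediaCodec): ByteArray? {
    val info = MediaCodec.BufferInfo()
    val index = encoder.dequeueOutputBuffer(info, 10_000)
    if (index < 0) return null  // nothing encoded yet
    val buffer = encoder.getOutputBuffer(index) ?: return null
    val frame = ByteArray(info.size)
    buffer.position(info.offset)
    buffer.get(frame)
    encoder.releaseOutputBuffer(index, false)
    return START_CODE + frame  // spliced NALU frame of the H.264 stream, ready for the TCP payload
}
```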
2. The Host end:
2.1, unpack the data, i.e., decompress the packed screen projection data transmitted by the Client;
2.2, parse the TCP data of the decompressed data packets and filter out the H.264 frame data, i.e., the NALU data frames;
2.3, construct IDR frames, such as PPS (Picture Parameter Set), SPS (Sequence Parameter Set), I frames, P frames, B frames, and the like, according to the NALUs, and then place them into a frame queue;
2.4, pre-create first-in first-out (FIFO) frame queues to realize concurrent frame processing, raising the video stream frame rate to 1080p@60Hz and improving the decoding rate to achieve low delay:
A. set the capacity maxsize of a frame queue to 10 frames; the queues can be dynamically created and released;
B. if a queue is full, the frame is not put into the queue and is discarded directly, so that performance is guaranteed;
C. after the decoding side finishes decoding one frame and a slot in the queue becomes free, a new frame can be enqueued, realizing high concurrency;
2.5, decode the frame data in the frame queues to obtain YUV data;
2.6, convert the YUV data into a pixel image in RGB format (see the sketch after this list);
2.7, the WebRTC (or Render) thread pushes the mirror image to the display, realizing mirror-image screen projection.
It should be noted that, in this embodiment, flow control on the Chrome-based operating system may be implemented through a progressive web application (PWA). When the present application is applied to devices with other operating systems, flow control may be implemented in a language suitable for the corresponding devices and operating systems.
In addition, the mirror-image screen projection interface in this embodiment can also realize bidirectional control. As shown in fig. 16, the specific flow is as follows:
At a Host end configured with the Chrome operating system, the received screen projection data from the Client end is processed by a system-level network service module; the screen projection service module processes the screen projection data into a pixel image in RGB format, which is output to an upper-layer application through WebRTC; the upper-layer application renders the mirror-image screen projection interface, through which the user interacts; a system-level event monitoring and service module monitors the interaction events, and the screen projection service module and the network service module synchronize these events to the Client end, thereby realizing bidirectional control between the Client end and the Host end.
It should be noted that, in the present specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts, the embodiments may be referred to one another.
For convenience of description, the above system or apparatus is described as being functionally divided into various modules or units, respectively. Of course, the functions of each element may be implemented in one or more software and/or hardware elements when implemented in the present application.
From the above description of embodiments, it will be apparent to those skilled in the art that the present application may be implemented in software plus a necessary general purpose hardware platform. Based on such understanding, the technical solutions of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform the methods described in the embodiments or some parts of the embodiments of the present application.
Finally, it is further noted that relational terms such as first, second, third, fourth, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The foregoing is merely a preferred embodiment of the present application. It should be noted that those skilled in the art may make modifications and adaptations without departing from the principles of the present application, and such modifications and adaptations are intended to fall within the scope of the present application.

Claims (10)

1. A processing method applied to a first device, the method comprising:
outputting an electronic information code, wherein the electronic information code comprises at least first information of the first device, so that a second device that obtains the electronic information code establishes a first connection with the first device at least according to the first information, the first connection being a point-to-point connection;
obtaining screen projection data of the second device at least according to the first connection;
and outputting the screen projection data.
2. The method of claim 1, wherein the first connection is used for instructing the second device to create a socket transmission channel with the first device according to the first information;
wherein obtaining the screen projection data of the second device at least according to the first connection comprises:
receiving the screen projection data transmitted by the second device through the socket transmission channel.
3. The method of claim 1, wherein outputting the screen projection data comprises:
obtaining, according to the screen projection data, a network abstraction layer data frame corresponding to the screen projection data;
parsing the network abstraction layer data frame to obtain a plurality of screen recording image frames;
decoding the screen recording image frames through a target component to obtain decoded data;
converting the decoded data according to a target image format to obtain a target image frame;
and outputting the target image frame.
4. The method of claim 3, wherein after parsing the network abstraction layer data frame to obtain the plurality of screen recording image frames and before decoding the screen recording image frames through the target component to obtain the decoded data, the method further comprises:
adding the screen recording image frames to pre-configured frame queues, wherein the frame queues are first-in first-out queues, there are a plurality of frame queues, the number of frame queues is determined according to the data type of the screen projection data, and the data type is related to the data volume of the screen projection data;
wherein decoding the screen recording image frames through the target component to obtain the decoded data comprises:
reading the screen recording image frames from each frame queue respectively;
and calling the target component so that the target component decodes the screen recording image frames to obtain the decoded data.
5. The method of claim 1, wherein after outputting the screen projection data, the method further comprises:
acquiring, through a pre-configured monitoring component, a first input operation directed at the screen projection data;
and in response to the first input operation, executing a first instruction, the first instruction causing the second device to execute a second instruction.
6. A processing method applied to a second device, the method comprising:
scanning an electronic information code output by a first device to obtain first information of the first device contained in the electronic information code;
establishing a first connection with the first device at least according to the first information, the first connection being a point-to-point connection;
obtaining screen projection data of the second device;
and transmitting the screen projection data to the first device at least according to the first connection, so that the first device outputs the screen projection data.
7. The method of claim 6, wherein after establishing the first connection with the first device at least according to the first information, the method further comprises:
creating, based on the first connection, a socket transmission channel with the first device according to the first information;
wherein transmitting the screen projection data to the first device at least according to the first connection comprises:
transmitting the screen projection data to the first device through the socket transmission channel.
8. The method of claim 6 or 7, wherein obtaining the screen projection data of the second device comprises:
capturing, through a target component, the output picture of the second device to obtain multiple screen recording data frames, wherein the target component is capable of image encoding and image decoding;
constructing network abstraction layer data frames according to the screen recording data frames;
and obtaining the screen projection data according to the network abstraction layer data frames.
9. An electronic device as a first device, comprising:
a display for outputting an electronic information code, the electronic information code comprising at least first information of the first device, so that a second device that obtains the electronic information code establishes a first connection with the first device at least according to the first information, the first connection being a point-to-point connection;
a communication module for obtaining screen projection data of the second device at least according to the first connection;
and a processor for outputting the screen projection data through the display.
10. An electronic device as a second device, comprising:
a scanner for scanning an electronic information code output by the first device to obtain first information of the first device contained in the electronic information code;
a communication module for establishing a first connection with the first device at least according to the first information, the first connection being a point-to-point connection;
and a processor for obtaining screen projection data of the second device and transmitting, through the communication module, the screen projection data to the first device at least according to the first connection, so that the first device outputs the screen projection data.
CN202310345018.4A 2023-03-31 2023-03-31 Processing method and electronic equipment Pending CN116360724A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310345018.4A CN116360724A (en) 2023-03-31 2023-03-31 Processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310345018.4A CN116360724A (en) 2023-03-31 2023-03-31 Processing method and electronic equipment

Publications (1)

Publication Number Publication Date
CN116360724A true CN116360724A (en) 2023-06-30

Family

ID=86920023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310345018.4A Pending CN116360724A (en) 2023-03-31 2023-03-31 Processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN116360724A (en)

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination