CN112383803A - Information processing method and related device - Google Patents
- Publication number
- CN112383803A (application number CN202011281262.1A)
- Authority
- CN
- China
- Prior art keywords
- electronic device
- target content
- target
- application
- content
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4122—Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2458—Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
- G06F16/2471—Distributed queries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/443—OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
- H04N21/4438—Window management, e.g. event handling following interaction with the user interface
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Fuzzy Systems (AREA)
- Mathematical Physics (AREA)
- Probability & Statistics with Applications (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Business, Economics & Management (AREA)
- Marketing (AREA)
- Information Transfer Between Computers (AREA)
Abstract
The embodiments of the present application disclose an information processing method and a related apparatus. The method includes: a first electronic device outputs target content from a second electronic device; the first electronic device receives updated target content from the second electronic device, where the updated target content is obtained by the second electronic device performing the following operations: receiving control information directed at the target content from a target device, updating the target content according to the control information, and sending the updated target content to the first electronic device; and the first electronic device outputs the updated target content. The method and apparatus help improve the comprehensiveness and flexibility of content display and control in multi-device interconnection scenarios.
Description
Technical Field
The present application relates to the field of information processing technologies, and in particular, to an information processing method and a related apparatus.
Background
In existing multi-device interconnection schemes for application scenarios such as smart homes, display of the same content on multiple devices is generally realized through a screen projection mechanism; for example, a mobile phone projects its streaming media to a television for playback through a screen projection service. Such single-function screen projection services are increasingly unable to meet users' diversified needs.
Disclosure of Invention
The embodiments of the present application provide an information processing method and a related apparatus, which aim to improve the comprehensiveness and flexibility of content display and control in multi-device interconnection scenarios.
In a first aspect, an embodiment of the present application provides an information processing method, including:
the first electronic device outputs target content from the second electronic device;
the first electronic device receives the updated target content from the second electronic device, where the updated target content is obtained by the second electronic device performing the following operations: receiving control information directed at the target content from a target device, updating the target content according to the control information, and sending the updated target content to the first electronic device;
and the first electronic equipment outputs the updated target content.
It can be seen that, in this example, the first electronic device can not only display the target content of the second electronic device, but also dynamically follow and display that content as the second electronic device updates it according to control information directed at it. The target device that inputs the control information may be the first electronic device itself or another device. The process of displaying the target content on the first electronic device is therefore consistent in effect with the first electronic device running a local application to display content, which preserves the user experience and improves the comprehensiveness and flexibility of content display and control in multi-device interconnection scenarios.
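The first-aspect flow described above can be sketched as follows; all class and method names (`FirstDevice`, `SecondDevice`, `on_control_info`, and so on) are illustrative assumptions, not part of the claimed method:

```python
from dataclasses import dataclass


@dataclass
class TargetContent:
    """Illustrative stand-in for the content streamed between devices."""
    payload: str
    version: int = 0


def apply_control(content: TargetContent, control_info: str) -> TargetContent:
    # The second electronic device updates the target content according to
    # the control information it received from the target device.
    return TargetContent(payload=control_info, version=content.version + 1)


class FirstDevice:
    """Receives and outputs content; does not run the source application."""
    def __init__(self):
        self.displayed = None

    def output(self, content: TargetContent):
        self.displayed = content


class SecondDevice:
    """Runs the source application and streams content to the first device."""
    def __init__(self, sink: FirstDevice, content: TargetContent):
        self.sink = sink
        self.content = content
        sink.output(content)              # initial output on the first device

    def on_control_info(self, control_info: str):
        self.content = apply_control(self.content, control_info)
        self.sink.output(self.content)    # send the updated content back


first = FirstDevice()
second = SecondDevice(first, TargetContent("game frame 0"))
second.on_control_info("game frame 1")    # control may come from any target device
```

The control information may originate from the first device itself or from a third device; either way, the update is applied on the second device and streamed back.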
In a second aspect, an embodiment of the present application provides an information processing method, including:
the first electronic device outputs target content from the second electronic device, wherein the target content is information which is not actually output by the second electronic device in a state of running a target application.
As can be seen, in this example, the first electronic device can display information that the second electronic device does not actually output while running the target application. That is, although the target application is not installed on the first electronic device, the first electronic device can output, through a capability device such as a display screen, the perceivable target content that the target application would need to output in its running state, thereby enabling cross-terminal use of the target application and improving the flexibility and comprehensiveness of application data display in multi-device interconnection scenarios.
In a third aspect, an embodiment of the present application provides an information processing method, including:
the method comprises the steps that a first electronic device detects a trigger operation of transferring target content to a second electronic device, wherein the target content is application data generated by a target application in an operating state;
the first electronic device sends the target content to the second electronic device, and the target content is used for the second electronic device to output the target content.
Therefore, in this example, the electronic device can stream application data generated by the target application in its running state between multiple devices, realizing cross-terminal output of application data in multi-device interconnection scenarios and improving the flexibility and comprehensiveness with which application data can be used in such scenarios.
In a fourth aspect, an embodiment of the present application provides an information processing apparatus, including:
an output unit for outputting the target content from the second electronic device;
a receiving unit, configured to receive the updated target content from the second electronic device, where the updated target content is obtained by the second electronic device performing the following operations: receiving control information directed at the target content from a target device, updating the target content according to the control information, and sending the updated target content to the first electronic device;
and the output unit is used for outputting the updated target content.
In a fifth aspect, an embodiment of the present application provides an information processing apparatus, including:
and the output unit is used for outputting target content from the second electronic equipment, wherein the target content is information which is not actually output by the second electronic equipment in a state of running the target application.
In a sixth aspect, an embodiment of the present application provides an information processing apparatus, including:
the detection unit is used for detecting a trigger operation of transferring target content to second electronic equipment, wherein the target content is application data generated by a target application in an operating state;
a sending unit, configured to send the target content to the second electronic device, where the target content is used for the second electronic device to output the target content.
In a seventh aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing the steps of any of the methods in the first to third aspects of the embodiments of the present application.
In an eighth aspect, an embodiment of the present application provides a chip, including: a processor, configured to call and run a computer program from a memory, so that a device on which the chip is installed performs some or all of the steps described in any of the methods in the first to third aspects of the embodiments of the present application.
In a ninth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods in the first to third aspects of the present application.
In a tenth aspect, the present application provides a computer program, where the computer program is operable to make a computer perform some or all of the steps described in any of the methods in the first to third aspects of the present application. The computer program may be a software installation package.
Drawings
The drawings required for describing the embodiments or the prior art are briefly introduced below.
Fig. 1A is a schematic flowchart of an information processing method provided in an embodiment of the present application;
fig. 1B is a schematic structural diagram of a multi-device unified management system according to an embodiment of the present application;
Fig. 1C is a schematic diagram illustrating that multiple devices cooperatively use the same application in a one-to-one mode according to an embodiment of the present disclosure;
fig. 1D is a schematic diagram illustrating that multiple devices cooperatively use the same application in another one-to-one mode according to the embodiment of the present application;
fig. 1E is a schematic diagram illustrating an interface of a single device showing multiple devices running different types of applications in a many-to-one mode according to an embodiment of the present application;
fig. 1F is a schematic diagram of an interface, where a single device displays multiple devices running the same type of application in a many-to-one mode according to an embodiment of the present application;
fig. 1G is a schematic diagram illustrating the one-to-many mode, in which the application interface of the same device is shown on multiple devices, according to an embodiment of the present application;
fig. 1H is a schematic diagram of multiple devices cooperatively operating a same application in an MVC mode according to an embodiment of the present application;
fig. 2A is a schematic flowchart of an information processing method according to an embodiment of the present application;
fig. 2B is a schematic diagram of an application interface for displaying, by another device, a single application running in the background of a current device according to an embodiment of the present application;
fig. 2C is a schematic diagram of an application interface for displaying, by another device, a plurality of applications currently running on the device according to an embodiment of the present application;
fig. 3A is a schematic flowchart of an information processing method according to an embodiment of the present application;
FIG. 3B is a schematic diagram illustrating a cross-device streaming of target content according to an embodiment of the present disclosure;
fig. 3C is a schematic diagram of an application layer streaming media redirection architecture according to an embodiment of the present application;
fig. 3D is a data flow diagram of application layer streaming media redirection provided by an embodiment of the present application;
fig. 4 is a block diagram of the functional units of an information processing apparatus provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of an information processing apparatus provided in an embodiment of the present application;
fig. 6 is a block diagram of the functional units of another information processing apparatus provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of another information processing apparatus provided in an embodiment of the present application;
fig. 8 is a block diagram of the functional units of another information processing apparatus provided in an embodiment of the present application;
fig. 9 is a schematic structural diagram of another information processing apparatus provided in an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
At present, an end-to-end P2P network is constructed through the wireless fidelity (Wi-Fi) devices in a television: the television is defined as the P2P device group management end, and the mobile terminal is defined as a P2P device group client. The Miracast protocol is started, and when the television receives screen casting requests from one or more mobile terminals, it casts the screen content of those mobile terminals onto the television's display screen.
This approach satisfies only unidirectional screen projection at the display layer and can hardly meet users' increasingly rich application needs.
In view of the above problem, an embodiment of the present application provides an information processing method, which is described in detail below with reference to the accompanying drawings.
Referring to fig. 1A, fig. 1A is a schematic flowchart of an information processing method provided in an embodiment of the present application, and is applied to any electronic device supporting a display function in a target device set, where the target device set is a device set composed of one or more devices under the same user account, and as shown in the figure, the method includes:
in step 101, a first electronic device outputs a target content from a second electronic device.
The target content may be data of a specific file type on the second electronic device, such as a document, a photo, a video, or music, and the first electronic device may output the corresponding content through a local application capable of reading such files, for example playing video through a dedicated video player that reads the video file, or playing music through a music player that reads the corresponding music file.
In addition, the target content may also be application data of a target application on the second electronic device in a running state, such as the UI data, audio data, and vibration data of a game application; the first electronic device may output this content through any one of a distributed application mechanism, a locally compatible application mechanism, and a redirection application mechanism (specifically, redirection based on control binding, redirection based on partial screen projection, redirection from a large screen to a small screen, or a universal screen projection manner).
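The choice among the three output mechanisms named above could be sketched as a simple dispatch; the selection policy and all names here are assumptions for illustration, since the embodiment does not prescribe a selection rule:

```python
# The three output mechanisms named in the description above.
MECHANISMS = (
    "distributed_application",
    "locally_compatible_application",
    "redirection_application",
)


def choose_mechanism(target_installed_locally: bool,
                     supports_distribution: bool) -> str:
    """Pick an output mechanism for the first device (illustrative policy only)."""
    if supports_distribution:
        # Both devices support the distributed application mechanism.
        return "distributed_application"
    if target_installed_locally:
        # The first device has a compatible application installed locally.
        return "locally_compatible_application"
    # Fall back to redirection (control binding, partial screen projection,
    # large-to-small-screen redirection, or universal screen projection).
    return "redirection_application"
```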
In a specific implementation, a unified desktop is displayed on the first electronic device. Metadata items exist on the unified desktop; each metadata item is associated with data of a given file type on an associated device, or with an application of an associated device. The metadata is implemented by a server-plus-terminal multi-device management system.
A metadata item consists of:
1. Icon: the picture or icon displayed on the unified desktop.
2. Name: the name displayed on the unified desktop.
3. Unique path: a unique path through which this metadata can be accessed.
The unique path contains a unique device ID plus a path that is unique under that device.
4. Metadata category: a file type (e.g., metadata for music, video, documents, photos, etc.) or an application type (e.g., device-level metadata, aggregated-application-level metadata, individual-application-level metadata).
5. Attributes: each item of data has its own attributes, such as the size of a picture or the duration of a video.
These metadata attributes enable the unified desktop to present the data well and to find a suitable way to open it when the user triggers its use.
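The five-part metadata item described above can be sketched as a small data structure; the field names are assumptions for illustration:

```python
from dataclasses import dataclass, field


@dataclass
class Metadata:
    """One unified-desktop metadata item (field names are illustrative)."""
    icon: str       # picture/icon displayed on the unified desktop
    name: str       # name displayed on the unified desktop
    device_id: str  # unique device ID part of the unique path
    path: str       # path that is unique under that device
    category: str   # file type ("music", "video", ...) or an application type
    attributes: dict = field(default_factory=dict)  # e.g. picture size, video duration

    @property
    def unique_path(self) -> str:
        # The unique path combines the device ID with the per-device path.
        return f"{self.device_id}:{self.path}"


song = Metadata(icon="music.png", name="Song A", device_id="phone-1",
                path="/music/song_a.mp3", category="music",
                attributes={"duration_s": 215})
```

The `attributes` dictionary is what lets the unified desktop choose a suitable presentation and opening method per item.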
In a specific implementation, as shown in the architecture diagram of the multi-device unified management system in fig. 1B, the unified desktop can be implemented jointly by a server and terminals.
The server (shown as a cloud system) is mainly responsible for management of devices under a user name and management of applications and data, and specifically may include a device management module, an application management module, and a user management module, where any two of the device management module, the application management module, and the user management module are in communication connection with each other.
The terminal (shown as device a, device B, and device C) includes a unified desktop system and/or a background support system, the unified desktop system includes at least one of a user interface UI module, a redirection receiving end Sink module, an application decision module, and an own application module, and the background support system includes at least one of a connection module, a distributed data service system, a distributed resource system, and a redirection originating end source system.
The unified desktop system is the main interaction entry; through it, a user can realize unified management and use of multiple devices.
The connection module is used to ensure connectivity among the multiple devices and can support various conventional communication technologies, such as mobile communication technologies (e.g., the fifth-generation mobile communication technology 5G) and short-range wireless local-area communication technologies (e.g., Bluetooth and ultra-wideband UWB).
The distributed data service system is used for realizing the intercommunication of data among a plurality of devices.
The redirection source system and the distributed resource system are used for realizing cross-end use of the application.
Wherein different devices may be provided with both a unified desktop system and a back-end support system, or only one of them, depending on the capabilities of the device.
For example, rich-capability devices such as mobile phones and PCs may have both a unified desktop system and a background support system; customer premises equipment (CPE), routers, and the like may have only a background support system; and a web client may have only a unified desktop system.
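The capability-based composition above (which device classes carry which subsystems) can be sketched as a lookup table; the device classes and helper names are illustrative:

```python
# Hypothetical capability table following the description above: rich devices
# carry both systems, CPE/routers only the background support system, and a
# web client only the unified desktop system.
DEVICE_SYSTEMS = {
    "phone":  {"unified_desktop", "background_support"},
    "pc":     {"unified_desktop", "background_support"},
    "cpe":    {"background_support"},
    "router": {"background_support"},
    "web":    {"unified_desktop"},
}


def can_initiate_redirection(device: str) -> bool:
    # The redirection Source system lives in the background support system.
    return "background_support" in DEVICE_SYSTEMS.get(device, set())


def can_present_unified_desktop(device: str) -> bool:
    # The UI, Sink, and application decision modules live in the
    # unified desktop system.
    return "unified_desktop" in DEVICE_SYSTEMS.get(device, set())
```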
Wherein the target content is at least one of the following information actually output by the second electronic device: audio, video, pictures, text, motion information (e.g., vibration information).
Wherein, the target device includes the first electronic device or a third electronic device other than the first electronic device and the second electronic device.
Depending on the number of devices on each side of a stream, streaming can be divided into a one-to-one mode, a many-to-one mode, a one-to-many mode, and an MVC mode (model-view-controller: M refers to the business model, V to the user interface, and C to the controller; MVC separates the implementation code of M and V so that the same program can use different presentations).
The following description is made with reference to examples.
For example, as shown in fig. 1C, in the one-to-one mode, if a mobile phone and a PC are interconnected, a video application runs on the mobile phone, and the video application is not installed on the PC, the PC may share the display and playback capability of the video application on the mobile phone; that is, the picture and sound streams of the application on the mobile phone are transferred to the PC for playback, realizing cross-device use of the video application.
For another example, as shown in fig. 1D, in the one-to-one mode, a couple plays the same game cooperatively on two mobile phones, overturning traditional single-player gameplay: the multi-device cooperation capability provided by the embodiments of the present application expands a single-player game into a multi-player game, bringing innovation in gameplay and experience. That is, device A runs game application A; device B transfers the picture and sound streams of game application A to the local device for playback; operation information on device B can be transferred back to device A for response; and the picture and sound information after the response are transferred back again, realizing the transfer of both application data and control information.
For another example, in the many-to-one mode, multiple paths of audio and video streams are presented on the same large-screen device together, where the sources of the multiple paths of audio and video streams are not limited, and may be from different devices, or may be from multiple physical screens of the same device, or may be from multiple virtual screens of the same device, or even from the large-screen device itself. As shown in fig. 1E, the pictures of the mobile phone 1, the mobile phone 2, the mobile phone 3 and the tablet in the figure are all displayed on the large screen of the television.
For another example, as shown in fig. 1F, a multi-user screen projection product from mobile phones to a TV adopts the many-to-one mode, displaying at most four pictures simultaneously. For the presentation of multiple screens, a dynamic layout is generated according to the display state of the current input sources. As illustrated, when four users use application 1 together, the application screens displayed on the respective user terminals are all presented on the same screen.
For another example, as shown in fig. 1G, the one-to-many mode can be subdivided into two dimensions: device level and application level. The content of the audio and video streams distributed over the multiple paths is identical, so in this mode audio/video capture and encoding are multiplexed, and the stream is split into multiple paths only at distribution time.
The one-to-many mode can be used in video-conference scenarios: the audio and video information of one device (the redirection initiating end) can be redirected simultaneously to multiple mobile phone/PC/TV terminals, and each redirection receiving end can independently configure its own display and reverse-control permissions.
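A one-to-many session with per-receiver permissions, as described above, might be sketched as follows; all class and attribute names are assumptions:

```python
class ReceivingEnd:
    """One redirection receiving end with its own permission configuration."""
    def __init__(self, name: str, display: bool = True, control: bool = False):
        self.name = name
        self.display = display  # whether this end renders the stream
        self.control = control  # whether this end may send reverse control


class OneToManySession:
    """One redirection initiating end distributing to many receiving ends."""
    def __init__(self, initiator: str):
        self.initiator = initiator
        self.receivers = []

    def add_receiver(self, receiver: ReceivingEnd):
        self.receivers.append(receiver)

    def distribute(self, frame: str):
        # Capture and encoding happen once at the initiator; only the
        # distribution step is split into multiple paths.
        return [r.name for r in self.receivers if r.display]


conf_call = OneToManySession("phone")
conf_call.add_receiver(ReceivingEnd("pc", display=True, control=True))
conf_call.add_receiver(ReceivingEnd("tv", display=True, control=False))
```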
For another example, as shown in fig. 1H, in the MVC mode, control, display, and data are handled by three separate parties, realizing the separation of display, running, and control, so that the advantages of each device are fully exploited and the parties cooperate to complete a cross-end task. In the legend, the television contributes its large-screen display capability, the PC brings the advantages of keyboard and mouse control, and the mobile phone provides the application, the data, and the computing power, bringing users a better cross-end gaming experience. (For control injection, a peripheral such as a Bluetooth gamepad may also be used.) The scenario may include the following data interaction steps.
(1) The PC queries mobile phone 1 for distribution data.
(2) Mobile phone 1 provides the distribution data to the PC.
(3) The PC pushes the connection configuration to the television.
(4) The television requests a video connection from mobile phone 1.
(5) Mobile phone 1 provides a video stream to the television.
(6) The PC sends control information to mobile phone 1.
It can be seen that, in the embodiment of the present application, while the first electronic device displays the target content of the second electronic device, it can also dynamically follow and display the target content as the second electronic device updates it according to control information for that content. The target device entering the control information may be the first electronic device itself or another device, so the process of displaying the target content on the first electronic device matches, in effect, the first electronic device running a local application to display content itself. This ensures user experience and improves the comprehensiveness and flexibility of content display and control in a multi-device interconnection scenario.
Referring to fig. 2A, fig. 2A is a schematic flowchart of an information processing method provided in an embodiment of the present application. The method is applied to any electronic device supporting a display function in a target device set. As shown in the figure, the method includes:
in step 201, a first electronic device outputs target content from a second electronic device, where the target content is information that is not actually output by the second electronic device in a state of running a target application.
The target application is an application running in the background.
Wherein the target content includes at least one of the following information: audio, video, pictures, text, motion information (e.g., vibration information).
The embodiment of the application proposes the concept of application-level screen projection. Application-level screen projection adapts to the operation habits of a PC: data is transmitted per application on the mobile phone, without affecting current use of the phone. Different from traditional mirror-image screen projection, this mode supports multi-channel background projection and expands the mobile phone from a single-task mode to a multi-task mode. The display resolution can adapt to the client, and application-level reverse control is supported. The specific implementation mechanism of application-level screen projection is to perform distributed management of the application program at the granularity of services (e.g., a display service, a sound service, a positioning service, an input service, a payment service, and the like).
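The service-granularity mechanism can be illustrated with a hypothetical routing table that redirects only selected services of one application to a remote device while everything else stays local. The service names, device names, and the default-local rule are assumptions for illustration only:

```python
# Per-(application, service) routing; "local" means the phone keeps
# handling that service itself.
routes = {
    ("app2", "display"): "pc",     # background app's UI shown on the PC
    ("app2", "input"):   "pc",     # PC keyboard/mouse reverse-controls it
    ("app2", "sound"):   "pc",
    ("app1", "display"): "local",  # foreground app is unaffected
}

def route(app, service):
    """Resolve where a given service of a given application runs."""
    return routes.get((app, service), "local")

assert route("app2", "display") == "pc"
assert route("app1", "display") == "local"
assert route("app1", "payment") == "local"  # unlisted services default local
```

Because redirection is decided per service rather than per screen, a background application can be fully projected while the foreground application, and any service not listed, continues to run on the phone untouched.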
For example, as shown in fig. 2B, in the background mode, a game is projected from the background onto the PC for display and control, so the user can browse WeChat while playing the game, with neither interfering with the other. In the figure, mobile phone 1, as the redirection initiating terminal, runs application 1 in the foreground and displays picture 1 of application 1, runs application 2 in the background, and sends an audio/video stream to the PC; the PC, as the redirection receiving terminal, displays picture 2 of application 2 on its screen and can send a control stream back to mobile phone 1.
For example, as shown in fig. 2C, in the multi-window mode, multiple applications are simultaneously opened on a large-screen device in multiple windows, improving office efficiency and entertainment experience. In the figure, mobile phone 1, as the redirection initiating terminal, runs application 1, application 2, and application 3, and the PC, as the redirection receiving terminal, simultaneously presents the pictures of application 1, application 2, and application 3 on its screen.
The support for various complex intelligent interconnection scenarios embodies the flexibility of the scheme of this embodiment: end users or upper-layer application developers can easily assemble customized interconnection scenarios through configuration and secondary development.
It can be seen that, in the embodiment of the application, the first electronic device can display information that is not actually output by the second electronic device in the state of running the target application. That is, although the target application is not installed at the local terminal, the first electronic device can output, through a capability device such as a display screen, the perceptible target content that the target application needs to output externally in the running state, thereby realizing cross-terminal use of the target application and improving the flexibility and comprehensiveness of application data display in a multi-device interconnection scenario.
In one possible example, the target application includes a plurality of applications; the first electronic device outputs target content from a second electronic device, including: and the first electronic equipment displays the interface data of the plurality of applications in a multi-window mode.
The plurality of applications may be of the same type or of different types, which is not limited herein.
Therefore, in this example, the first electronic device can display the interfaces of multiple applications run by other devices, enabling cross-terminal use of multiple applications and improving the convenience and flexibility of application use in a multi-device interconnection scenario.
Referring to fig. 3A, fig. 3A is a schematic flowchart of an information processing method provided in an embodiment of the present application, applied to any electronic device supporting a display function in a target device set. As shown in the figure, the method includes:
The triggering operation includes any one of the following: a touch operation, a gesture operation, a voice operation, or short-range wireless communication (such as Near Field Communication (NFC) and ultra-wideband (UWB) communication). The target content includes at least one of the following information: audio, video, pictures, text, and motion information (e.g., vibration information).
When the target content is a video, this specifically refers to the flow of a screen picture. Flow between devices mainly falls into two modes. In one mode, the application flows from the source device to a target device; in this process the application can keep the current session state uninterrupted. In the other mode, the flow is between two target devices; in this process the session state on the original target device is interrupted, that is, the interface of the application is no longer displayed there. Triggering modes of the flow include, but are not limited to, NFC, UWB, gestures, and voice.
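The two flow modes can be sketched as a small state update: flowing from the source device leaves its session alive, while flowing between two target devices interrupts the session on the old target. The dictionary layout and field names are illustrative assumptions:

```python
def flow(sessions, src, dst):
    """Move the application picture from src to dst.
    sessions maps device name -> {"showing": bool, "is_source": bool}."""
    is_source_device = sessions.get(src, {}).get("is_source", False)
    sessions[dst] = {"showing": True, "is_source": False}
    if not is_source_device:
        # Mode 2: target -> target, the old target stops displaying.
        sessions[src]["showing"] = False
    # Mode 1 (source -> target) needs no change: the source keeps its UI.
    return sessions

s = {"phone1": {"showing": True, "is_source": True}}
s = flow(s, "phone1", "pc")      # source -> target: phone keeps its interface
assert s["phone1"]["showing"] is True
s = flow(s, "pc", "tv")          # target -> target: PC stops displaying
assert s["pc"]["showing"] is False and s["tv"]["showing"] is True
```

This mirrors the fig. 3B example below: mobile phone 1 keeps its interface after streaming to the PC, while the PC's picture disappears once the television takes over.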
After the first electronic device sends the target content to the second electronic device, the method further includes: the first electronic device cancels the display of the target content.
As shown in fig. 3B, mobile phone 1 is the redirection initiating terminal running picture 1 of application 1, and the PC and the television are both target devices. When picture 1 flows from mobile phone 1 to the PC, the application interface on mobile phone 1 is maintained; when the PC then streams the picture to the television, the PC may stop displaying the picture once the television has successfully received and displayed the picture of application 1.
Therefore, in the embodiment of the application, the electronic device can flow the application data generated by the target application in the running state among multiple devices, realizing cross-terminal output of application data and improving the flexibility and comprehensiveness of application data use in a multi-device interconnection scenario.
In one possible example, the first electronic device sending the target content to the second electronic device includes: the first electronic device indirectly sends the target content to the second electronic device through a multi-device management server; or, the first electronic device directly sends the target content to the second electronic device through a preset communication network.
As can be seen, in this example, when the first electronic device is used as a streaming source device, the first electronic device may transmit data with a streaming target device in a direct or indirect manner to implement cross-device output of content, so as to improve flexibility and convenience of content use in a multi-device interconnection scenario.
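The two transmission paths above can be sketched as one send routine that prefers a direct connection and falls back to relaying through the management server. The function signature and the list-based transports are assumptions for illustration:

```python
def send_content(content, receiver, server=None, direct_link=None):
    """Deliver content to the receiver either directly over a preset
    communication network, or indirectly via a multi-device
    management server."""
    if direct_link is not None:
        direct_link.append(("direct", receiver, content))
        return "direct"
    if server is not None:
        server.append(("relay", receiver, content))
        return "indirect"
    raise RuntimeError("no path to receiver")

log = []
assert send_content("frame", "tv", server=log) == "indirect"
assert send_content("frame", "tv", direct_link=log) == "direct"
assert log[0][0] == "relay" and log[1][0] == "direct"
```

Preferring the direct path when both exist is this sketch's own design choice (lower latency on a local network); the patent only states that both paths are possible.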
In one possible example, the target content is application data generated in a running state by a target application running on a third electronic device; the first electronic device sending the target content to the second electronic device includes: the first electronic device transmits a streaming request to the third electronic device, the streaming request including device information of the second electronic device as the content receiving device, and the streaming request being used for the third electronic device to perform the following operations: indirectly sending the target content to the second electronic device through a multi-device management server; or directly sending the target content to the second electronic device through a preset communication network.
As can be seen, in this example, when the first electronic device is the flow initiating device but not the source device of the target content, it can cause the source device to deliver the content to the content receiving device by server relay or direct communication, implementing cross-device output of the content and improving the flexibility and convenience of content use in a multi-device interconnection scenario.
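This three-party handoff can be sketched as follows: the initiator only hands the source device a streaming request naming the receiver, and the source then delivers the content itself. The class, request fields, and address are hypothetical:

```python
class SourceDevice:
    """Toy third electronic device that actually holds the content."""
    def __init__(self):
        self.sent_to = []

    def handle(self, request):
        # On a streaming request, send the target content to the receiver
        # named in the request (directly or via a management server).
        self.sent_to.append(request["receiver"]["id"])
        return "ok"

def initiate_flow(source_device, receiver_info):
    """The initiator is not the content source: it only forwards a
    streaming request carrying the receiver's device information."""
    request = {"type": "streaming_request", "receiver": receiver_info}
    return source_device.handle(request)

src = SourceDevice()
assert initiate_flow(src, {"id": "tv", "addr": "192.168.1.20"}) == "ok"
assert src.sent_to == ["tv"]   # content never passes through the initiator
```

The key property shown is that the target content itself never transits the initiating device; only a small request message does.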
The technical principle underlying the embodiments of the method described above is explained below.
As shown in fig. 3C, the hierarchical model specifically includes:
1. Application layer: responsible for externally providing redirection services, UI interaction, and presentation of the screen projection effect.
2. Service layer: encapsulates and combines the lower-layer functional modules to provide higher-abstraction services to the application layer.
3. Redirection layer: responsible for collecting and playing audio/video data, and for the capture, processing, and reverse control of control data.
4. Codec layer: responsible for encoding and decoding the streaming media data.
5. Data channel layer: provides a service data channel upward, encapsulating the binary data stream into data frames carrying service logic, such as video frames, audio frames, and control signaling.
6. Transport layer: transmits the binary data stream based on standard or custom network communication protocols.
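The data channel layer's job — wrapping a raw binary stream into typed frames — can be sketched with a simple type/length/payload header. The patent does not give the frame format; this 1-byte-type, 4-byte-length layout is an assumption for illustration:

```python
import struct

FRAME_TYPES = {"video": 1, "audio": 2, "control": 3}

def pack_frame(kind, payload: bytes) -> bytes:
    # 1-byte frame type + 4-byte big-endian payload length + payload
    return struct.pack(">BI", FRAME_TYPES[kind], len(payload)) + payload

def unpack_frame(data: bytes):
    ftype, length = struct.unpack(">BI", data[:5])
    kind = {v: k for k, v in FRAME_TYPES.items()}[ftype]
    return kind, data[5:5 + length]

frame = pack_frame("video", b"\x00\x01nal-unit")
kind, payload = unpack_frame(frame)
assert kind == "video" and payload == b"\x00\x01nal-unit"
```

With such a header, the transport layer below can stay a dumb byte pipe while the layers above multiplex video frames, audio frames, and control signaling over one connection.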
The core function module specifically includes:
1. Redirection Manager (illustrated as redirection management): provides various redirection services.
2. Connection Manager (illustrated as connection management): establishes connection channels and supports multiple connection modes and transport protocols.
3. Device Manager (illustrated as device management): manages device information and negotiates software and hardware capabilities.
4. Status Manager (not shown in the figure): monitors and synchronizes the various normal and abnormal states of screen projection.
5. Scene Manager (not shown in the figure): responsible for scene management in two dimensions. Deployment level: configures and deploys various interconnected screen projection scenarios. Application level: applies customized start-up strategies for different applications.
6. Video Sampler & Redirector (illustrated as video capture redirection): responsible for video stream collection, using screen recording at the device level and display redirection at the application level.
7. Video Renderer (illustrated as video rendering): responsible for video rendering and display; in multi-path presentation, OpenGL primitives are used for preprocessing and merged rendering.
8. Control Sampler & Handler (illustrated as control transport and control collection): responsible for the collection, operation mapping, and dispatch execution of reverse control, again at both the application level and the device level.
9. Video Encoder & Decoder (illustrated as video encoding and video decoding): responsible for video codec work; the video coding formats mainly adopted are AVC (H.264) and HEVC (H.265).
10. Audio Encoder & Decoder (illustrated as audio encoding and audio decoding): responsible for audio codec work; the audio coding formats mainly adopted are the PCM raw code stream and AAC.
11. Video/Audio/Control Channel (illustrated as video channel, audio channel, control channel): provides a service data channel by upward encapsulation, and is responsible downward for serializing/deserializing the video frames/logic frames/control signaling of the corresponding channel.
12. Tcp/Quic/NST Tunnel (illustrated as Tcp channel, Quic channel, NST channel): used for binary data transmission at the network I/O layer, providing synchronous/asynchronous read/write capability. NST Tunnel is an autonomously designed near-field low-latency customized streaming media protocol.
As shown in fig. 3D, the redirection initiating terminal notifies the redirection manager, through the ARS initiation module, to establish one or more application-level redirection services; each application-level redirection service transmits audio/video/control information, responds to control information, and communicates with the redirection receiving terminal through the connection manager. The redirection receiving terminal establishes one or more application-level redirection services through its local redirection manager; each of these decodes, renders, and displays video, decodes and plays audio, and collects and returns control information.
ARS refers to an application-level redirection service, which provides an application picture running on the device to a third-party device in the form of a video stream, allowing the third-party device to display and operate an application on the current device. Simultaneous running of multiple applications is supported, and the operation habits are consistent with those of a PC.
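The redirection-manager flow of fig. 3D — one manager instantiating several independent application-level redirection services — can be sketched as follows. The class and method names are illustrative, not the patent's API:

```python
class AppRedirectionService:
    """One ARS instance per projected application."""
    def __init__(self, app):
        self.app = app
        self.frames = []

    def push_video(self, frame):
        self.frames.append(frame)  # receiver side would decode and render

class RedirectionManager:
    def __init__(self):
        self.services = {}

    def establish(self, app):
        # Each application gets its own independent redirection service,
        # so multiple apps can be projected concurrently.
        self.services[app] = AppRedirectionService(app)
        return self.services[app]

mgr = RedirectionManager()
for app in ("app1", "app2", "app3"):   # multi-window projection, fig. 2C
    mgr.establish(app)
mgr.services["app2"].push_video("frame0")
assert len(mgr.services) == 3
assert mgr.services["app2"].frames == ["frame0"]
```

Keeping one service object per application is what lets the streams stay independent: pausing or closing one projected application tears down only its own service.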
The embodiment of the application conforms to the development concept of the interconnection of everything and provides a more unified, flexible, and universal redirection framework. It supports mainstream intelligent terminal devices, operating systems, connection modes, data channels, and transport protocols; shields the heterogeneity of terminal devices; and offers high extensibility at each layer and module for continuous expansion and adaptation.
With the increasing demand for interconnection and intercommunication, current solutions can each meet only part of users' requirements, forcing users to adopt multiple sets of solutions simultaneously. The invention can build various intelligent interconnection scenarios under one technical scheme, including one-to-one, many-to-one, one-to-many, MVC (running/display/control separation), application-level screen projection, application stream transfer, and the like. Some of these screen projection modes, such as the MVC mode, are innovatively proposed in the embodiments of the present application.
Device and platform compatibility is extensible (e.g., to future smart glasses); the intelligent interconnection scenarios are extensible; the data channels are extensible; the transport protocols are extensible; the connection modes are extensible; and the codecs are extensible.
The embodiment of the application provides an information processing apparatus, which may be a terminal. Specifically, the information processing apparatus is configured to perform the steps performed by the terminal in the above information processing method. The information processing apparatus provided in the embodiment of the present application may include modules corresponding to the corresponding steps.
In the embodiment of the present application, the information processing apparatus may be divided into functional modules according to the above method example. For example, each functional module may be divided according to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware, or as a software functional module. The division of modules in the embodiment of the present application is schematic and is only a logical function division; other division manners are possible in actual implementation.
Fig. 4 is a schematic diagram showing a possible configuration of the information processing apparatus according to the above-described embodiment, in a case where each functional module is divided in correspondence with each function. As shown in fig. 4, the information processing apparatus 4 includes:
an output unit 40 for outputting target content from the second electronic device;
a receiving unit 41, configured to receive the updated target content from the second electronic device, where the updated target content is obtained by the second electronic device performing the following operations: receiving control information aiming at the target content from a target device, updating the target content according to the control information, and sending the updated target content to the first electronic device;
an output unit 42, configured to output the updated target content.
In one possible example, the target device includes the first electronic device or a third electronic device other than the first electronic device and the second electronic device.
In one possible example, the target content is at least one of the following information actually output by the second electronic device: audio, video, pictures, text, motion information.
All relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
In the case of using an integrated unit, a schematic structural diagram of an information processing apparatus provided in an embodiment of the present application is shown in fig. 5. In fig. 5, the information processing apparatus 5 includes: a processing module 50 and a communication module 51. The processing module 50 is used to control and manage actions of the information processing apparatus, such as the steps performed by the output unit 40, and/or other processes for performing the techniques described herein. The communication module 51 is used to support interaction between the information processing apparatus and other devices. As shown in fig. 5, the information processing apparatus may further include a storage module 52, and the storage module 52 is configured to store program codes and data of the information processing apparatus.
The processing module 50 may be a processor or a controller, for example a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination of computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 51 may be a transceiver, an RF circuit, a communication interface, or the like. The storage module may be a memory.
All relevant contents of each scenario related to the above method embodiment may be referred to in the functional description of the corresponding functional module and are not described herein again. The information processing apparatus may perform the steps performed by the terminal in the information processing method shown in fig. 1A.
The embodiment of the application provides an information processing apparatus, which may be a terminal. Specifically, the information processing apparatus is configured to perform the steps performed by the terminal in the above information processing method. The information processing apparatus provided in the embodiment of the present application may include modules corresponding to the corresponding steps.
In the embodiment of the present application, the information processing apparatus may be divided into functional modules according to the above method example. For example, each functional module may be divided according to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware, or as a software functional module. The division of modules in the embodiment of the present application is schematic and is only a logical function division; other division manners are possible in actual implementation.
Fig. 6 is a schematic diagram showing a possible configuration of the information processing apparatus according to the above-described embodiment, in a case where each functional module is divided in correspondence with each function. As shown in fig. 6, the information processing apparatus 6 includes:
an output unit 60, configured to output target content from a second electronic device, where the target content is information that is not actually output by the second electronic device in a state where a target application is running.
In one possible example, the target application is a background running application.
In one possible example, the target application includes a plurality of applications; in terms of the first electronic device outputting the target content from the second electronic device, the output unit 60 is specifically configured to: and displaying the interface data of the plurality of applications in a multi-window mode.
In one possible example, the target content includes at least one of the following information: audio, video, pictures, text, motion information.
All relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
In the case of using an integrated unit, a schematic structural diagram of an information processing apparatus provided in an embodiment of the present application is shown in fig. 7. In fig. 7, the information processing apparatus 7 includes: a processing module 70 and a communication module 71. The processing module 70 is used to control and manage actions of the information processing apparatus, such as the steps performed by the output unit 60, and/or other processes for performing the techniques described herein. The communication module 71 is used to support interaction between the information processing apparatus and other devices. As shown in fig. 7, the information processing apparatus may further include a storage module 72, and the storage module 72 is configured to store program codes and data of the information processing apparatus.
The processing module 70 may be a processor or a controller, for example a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination of computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 71 may be a transceiver, an RF circuit, a communication interface, or the like. The storage module may be a memory.
All relevant contents of each scenario related to the above method embodiment may be referred to in the functional description of the corresponding functional module and are not described herein again. The information processing apparatus may perform the steps performed by the terminal in the information processing method shown in fig. 2A.
The embodiment of the application provides an information processing apparatus, which may be a terminal. Specifically, the information processing apparatus is configured to perform the steps performed by the terminal in the above information processing method. The information processing apparatus provided in the embodiment of the present application may include modules corresponding to the corresponding steps.
In the embodiment of the present application, the information processing apparatus may be divided into functional modules according to the above method example. For example, each functional module may be divided according to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware, or as a software functional module. The division of modules in the embodiment of the present application is schematic and is only a logical function division; other division manners are possible in actual implementation.
Fig. 8 shows a schematic diagram of a possible configuration of the information processing apparatus according to the above-described embodiment, in the case of employing a division of each functional module corresponding to each function. As shown in fig. 8, the information processing apparatus 8 includes:
a detecting unit 80, configured to detect a trigger operation for transferring a target content to a second electronic device, where the target content is application data generated by a target application in an operating state;
a sending unit 81, configured to send the target content to the second electronic device, where the target content is used for the second electronic device to output the target content.
In one possible example, the triggering operation includes any one of: touch control operation, gesture operation, voice operation and short-distance wireless communication;
the target content includes at least one of the following information: audio, video, pictures, text, motion information.
In one possible example, in the aspect of sending the target content to the second electronic device, the sending unit 81 is specifically configured to: indirectly sending the target content to the second electronic device through a multi-device management server; or, the target content is directly sent to the second electronic device through a preset communication network.
In one possible example, the target content is application data generated by a target application running on the third electronic device in a running state; in the aspect of sending the target content to the second electronic device, the sending unit 81 is specifically configured to: transmitting a streaming request to the third electronic device, the streaming request including device information of the second electronic device as a content receiving device, the streaming request being used for the third electronic device to perform the following operations: indirectly sending the target content to the second electronic device through a multi-device management server; or, the target content is directly sent to the second electronic device through a preset communication network.
In one possible example, the apparatus further includes a display unit configured to cancel displaying the target content.
All relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
In the case of using an integrated unit, a schematic structural diagram of an information processing apparatus provided in an embodiment of the present application is shown in fig. 9. In fig. 9, the information processing apparatus 9 includes: a processing module 90 and a communication module 91. The processing module 90 is used to control and manage actions of the information processing apparatus, such as the steps performed by the detecting unit 80, and/or other processes for performing the techniques described herein. The communication module 91 is used to support interaction between the information processing apparatus and other devices. As shown in fig. 9, the information processing apparatus may further include a storage module 92, and the storage module 92 is configured to store program codes and data of the information processing apparatus.
The processing module 90 may be a processor or a controller, for example a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination of computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 91 may be a transceiver, an RF circuit, a communication interface, or the like. The storage module may be a memory.
All relevant contents of each scenario related to the above method embodiment may be referred to in the functional description of the corresponding functional module and are not described herein again. The information processing apparatus may perform the steps performed by the terminal in the information processing method shown in fig. 3A.
Claims (17)
1. An information processing method characterized by comprising:
the first electronic device outputs target content from the second electronic device;
the first electronic device receives the updated target content from the second electronic device, wherein the updated target content is obtained by the second electronic device by performing the following operations: receiving control information aiming at the target content from a target device, updating the target content according to the control information, and sending the updated target content to the first electronic device;
and the first electronic equipment outputs the updated target content.
2. The method of claim 1, wherein the target device comprises the first electronic device or a third electronic device other than the first electronic device and the second electronic device.
3. The method according to claim 1 or 2, wherein the target content is at least one of the following information actually output by the second electronic device: audio, video, pictures, text, motion information.
4. An information processing method characterized by comprising:
the first electronic device outputs target content from the second electronic device, wherein the target content is information which is not actually output by the second electronic device in a state of running a target application.
5. The method of claim 4, wherein the target application is a background running application.
6. The method of claim 4 or 5, wherein the target application comprises a plurality of applications; the first electronic device outputs target content from a second electronic device, including:
and the first electronic device displays the interface data of the plurality of applications in a multi-window mode.
7. The method of any of claims 4-6, wherein the target content comprises at least one of: audio, video, pictures, text, motion information.
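Claims 4-7 describe the first device outputting interface data for applications the second device runs in the background without actually displaying. The multi-window arrangement of claim 6 can be sketched as a simple grid layout; `layout_windows` and the frame strings are illustrative assumptions, not part of the claims.

```python
def layout_windows(app_frames, columns=2):
    """Arrange per-application interface data into a grid of windows."""
    windows = []
    for i, app in enumerate(sorted(app_frames)):
        windows.append({"app": app,
                        "row": i // columns,       # grid position for this window
                        "col": i % columns,
                        "frame": app_frames[app]})  # that app's interface data
    return windows


# Interface data the second device generated for two background applications:
background_frames = {"music": "<frame A>", "navigation": "<frame B>"}
windows = layout_windows(background_frames)
```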
8. An information processing method characterized by comprising:
a first electronic device detects a trigger operation for transferring target content to a second electronic device, wherein the target content is application data generated by a target application in a running state;
the first electronic device sends the target content to the second electronic device, the target content being for output by the second electronic device.
9. The method of claim 8, wherein the triggering operation comprises any one of: a touch operation, a gesture operation, a voice operation, and short-distance wireless communication;
the target content includes at least one of the following information: audio, video, pictures, text, motion information.
10. The method of claim 8 or 9, wherein the first electronic device sending the target content to the second electronic device comprises:
the first electronic device indirectly sends the target content to the second electronic device through a multi-device management server; or,
or the first electronic device directly sends the target content to the second electronic device through a preset communication network.
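Claim 10 admits two delivery paths: relay through a multi-device management server, or direct delivery over a preset communication network. The sketch below chooses between them; the selection predicate (`same_network`) and all names are assumptions for illustration only, since the claim requires only that both paths exist.

```python
def send_target_content(content, receiver, server=None, same_network=False):
    """Return a description of the delivery path chosen for the content."""
    if same_network:
        # Direct path over the preset communication network (e.g. same LAN).
        return {"path": "direct", "to": receiver, "content": content}
    if server is not None:
        # Indirect path: the multi-device management server relays it.
        return {"path": "via-server", "server": server,
                "to": receiver, "content": content}
    raise RuntimeError("no route to the receiving device")


relayed = send_target_content("slideshow", "tv-01", server="mgmt-server")
direct = send_target_content("slideshow", "tv-01", same_network=True)
```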
11. The method according to claim 8 or 9, wherein the target content is application data generated by a target application in a running state on a third electronic device; and the first electronic device sending the target content to the second electronic device comprises:
the first electronic device sends a streaming request to the third electronic device, the streaming request including device information of the second electronic device as the content receiving device, and the streaming request being used for the third electronic device to perform the following operations: indirectly sending the target content to the second electronic device through a multi-device management server; or directly sending the target content to the second electronic device through a preset communication network.
12. The method of any of claims 8-11, wherein after the first electronic device sends the target content to the second electronic device, the method further comprises:
the first electronic device canceling the display of the target content.
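Claims 8, 9, and 12 together describe a trigger-then-handoff flow: detect a recognized trigger operation, send the content for output on the receiving device, then cancel the local display. The sketch below is a minimal model of that sequence; the trigger set, class names, and `transfer` method are assumptions, not claimed structure.

```python
# Trigger operations listed in claim 9 (labels are illustrative):
TRIGGERS = {"touch", "gesture", "voice", "short-range-wireless"}


class Device:
    def __init__(self, name):
        self.name = name
        self.shown = None

    def output(self, content):
        self.shown = content


class SenderDevice(Device):
    def transfer(self, trigger, content, receiver):
        if trigger not in TRIGGERS:
            return False                 # not a recognized trigger operation
        receiver.output(content)         # claim 8: send content for output
        self.shown = None                # claim 12: cancel the local display
        return True


phone, tv = SenderDevice("phone"), Device("tv")
phone.output("video-stream")             # content initially shown on the phone
ok = phone.transfer("gesture", "video-stream", tv)
```

The point of claim 12 is visible in the last two lines: after a successful transfer, the content lives on exactly one screen.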
13. An information processing apparatus characterized by comprising:
an output unit for outputting the target content from the second electronic device;
a receiving unit, configured to receive updated target content from the second electronic device, wherein the updated target content is obtained by the second electronic device performing the following operations: receiving control information for the target content from a target device, updating the target content according to the control information, and sending the updated target content to the first electronic device;
the output unit being further configured to output the updated target content.
14. An information processing apparatus characterized by comprising:
an output unit, configured to output target content from a second electronic device, wherein the target content is information that is not actually output by the second electronic device while it runs a target application.
15. An information processing apparatus characterized by comprising:
a detection unit, configured to detect a trigger operation for transferring target content to a second electronic device, wherein the target content is application data generated by a target application in a running state;
a sending unit, configured to send the target content to the second electronic device, the target content being for output by the second electronic device.
16. An electronic device, comprising a processor, a memory, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps of the method of any of claims 1-12.
17. A computer-readable storage medium, characterized in that it stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method of any one of claims 1-12.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011281262.1A CN112383803B (en) | 2020-11-16 | 2020-11-16 | Information processing method and related device |
PCT/CN2021/121290 WO2022100308A1 (en) | 2020-11-16 | 2021-09-28 | Information processing method and related apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011281262.1A CN112383803B (en) | 2020-11-16 | 2020-11-16 | Information processing method and related device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112383803A true CN112383803A (en) | 2021-02-19 |
CN112383803B CN112383803B (en) | 2023-04-11 |
Family
ID=74585583
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011281262.1A Active CN112383803B (en) | 2020-11-16 | 2020-11-16 | Information processing method and related device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112383803B (en) |
WO (1) | WO2022100308A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112860367A (en) * | 2021-03-04 | 2021-05-28 | 康佳集团股份有限公司 | Equipment interface visualization method, intelligent terminal and computer readable storage medium |
CN113676384A (en) * | 2021-07-26 | 2021-11-19 | 青岛海尔科技有限公司 | Method and device for controlling screen projection of equipment, server and storage medium |
CN114157903A (en) * | 2021-12-02 | 2022-03-08 | Oppo广东移动通信有限公司 | Redirection method, redirection device, redirection equipment, storage medium and program product |
WO2022100308A1 (en) * | 2020-11-16 | 2022-05-19 | Oppo广东移动通信有限公司 | Information processing method and related apparatus |
CN115729502A (en) * | 2022-03-23 | 2023-03-03 | 博泰车联网(南京)有限公司 | Response method of screen projection terminal and display terminal, electronic device and storage medium |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118377496A (en) * | 2022-11-29 | 2024-07-23 | 华为技术有限公司 | Application cross-equipment circulation method, related device and communication system |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103324457A (en) * | 2013-06-21 | 2013-09-25 | 东莞宇龙通信科技有限公司 | Terminal and multi-task data display method |
CN106095084A (en) * | 2016-06-06 | 2016-11-09 | 乐视控股(北京)有限公司 | Throw screen method and device |
US20170272488A1 (en) * | 2014-08-25 | 2017-09-21 | Unify Gmbh & Co. Kg | Method for controlling a multimedia application, software product and device |
CN108958678A (en) * | 2017-05-25 | 2018-12-07 | 阿里巴巴集团控股有限公司 | Throw screen method, the sharing method of screen content and device |
CN110381195A (en) * | 2019-06-05 | 2019-10-25 | 华为技术有限公司 | A kind of throwing screen display methods and electronic equipment |
CN111314768A (en) * | 2020-02-24 | 2020-06-19 | 北京小米移动软件有限公司 | Screen projection method, screen projection device, electronic equipment and computer readable storage medium |
CN211429437U (en) * | 2019-10-12 | 2020-09-04 | 中冶赛迪重庆信息技术有限公司 | Intelligent screen projection system |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200014986A1 (en) * | 2016-06-02 | 2020-01-09 | John Senew | Apparatus and method for displaying video |
CN111107405A (en) * | 2019-12-27 | 2020-05-05 | 北京比利信息技术有限公司 | Screen projection method, server, screen projection system and storage medium |
CN111263218A (en) * | 2020-02-24 | 2020-06-09 | 卓望数码技术(深圳)有限公司 | Method and system for realizing synchronous interaction of multiple devices |
CN111327769B (en) * | 2020-02-25 | 2022-04-08 | 北京小米移动软件有限公司 | Multi-screen interaction method and device and storage medium |
CN111510751A (en) * | 2020-04-26 | 2020-08-07 | 深圳市易平方网络科技有限公司 | Cross-screen interactive advertisement delivery method, system and storage medium |
CN111767012A (en) * | 2020-05-29 | 2020-10-13 | 维沃移动通信有限公司 | Screen projection method and device |
CN111752518A (en) * | 2020-06-19 | 2020-10-09 | 海信视像科技股份有限公司 | Screen projection method of display equipment and display equipment |
CN112383803B (en) * | 2020-11-16 | 2023-04-11 | Oppo广东移动通信有限公司 | Information processing method and related device |
- 2020-11-16: CN application CN202011281262.1A filed; granted as CN112383803B (status: Active)
- 2021-09-28: PCT application PCT/CN2021/121290 filed (status: Application Filing)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103324457A (en) * | 2013-06-21 | 2013-09-25 | 东莞宇龙通信科技有限公司 | Terminal and multi-task data display method |
US20170272488A1 (en) * | 2014-08-25 | 2017-09-21 | Unify Gmbh & Co. Kg | Method for controlling a multimedia application, software product and device |
CN106095084A (en) * | 2016-06-06 | 2016-11-09 | 乐视控股(北京)有限公司 | Throw screen method and device |
CN108958678A (en) * | 2017-05-25 | 2018-12-07 | 阿里巴巴集团控股有限公司 | Throw screen method, the sharing method of screen content and device |
CN110381195A (en) * | 2019-06-05 | 2019-10-25 | 华为技术有限公司 | A kind of throwing screen display methods and electronic equipment |
CN211429437U (en) * | 2019-10-12 | 2020-09-04 | 中冶赛迪重庆信息技术有限公司 | Intelligent screen projection system |
CN111314768A (en) * | 2020-02-24 | 2020-06-19 | 北京小米移动软件有限公司 | Screen projection method, screen projection device, electronic equipment and computer readable storage medium |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022100308A1 (en) * | 2020-11-16 | 2022-05-19 | Oppo广东移动通信有限公司 | Information processing method and related apparatus |
CN112860367A (en) * | 2021-03-04 | 2021-05-28 | 康佳集团股份有限公司 | Equipment interface visualization method, intelligent terminal and computer readable storage medium |
CN112860367B (en) * | 2021-03-04 | 2023-12-12 | 康佳集团股份有限公司 | Equipment interface visualization method, intelligent terminal and computer readable storage medium |
CN113676384A (en) * | 2021-07-26 | 2021-11-19 | 青岛海尔科技有限公司 | Method and device for controlling screen projection of equipment, server and storage medium |
CN114157903A (en) * | 2021-12-02 | 2022-03-08 | Oppo广东移动通信有限公司 | Redirection method, redirection device, redirection equipment, storage medium and program product |
CN115729502A (en) * | 2022-03-23 | 2023-03-03 | 博泰车联网(南京)有限公司 | Response method of screen projection terminal and display terminal, electronic device and storage medium |
CN115729502B (en) * | 2022-03-23 | 2024-02-27 | 博泰车联网(南京)有限公司 | Screen-throwing end and display end response method, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2022100308A1 (en) | 2022-05-19 |
CN112383803B (en) | 2023-04-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112383803B (en) | Information processing method and related device | |
CN105656749B (en) | Distributed wireless multi-screen Virtual PC service system | |
CN102866828B (en) | A kind of terminal control method and equipment | |
WO2021143182A1 (en) | Game processing method and apparatus, electronic device, and computer-readable storage medium | |
US20220239718A1 (en) | Communication Protocol Switching Method, Apparatus, and System | |
WO2022121775A1 (en) | Screen projection method, and device | |
JP2023503679A (en) | MULTI-WINDOW DISPLAY METHOD, ELECTRONIC DEVICE AND SYSTEM | |
JP6492227B2 (en) | Method for content projection and mobile terminal | |
WO2010008230A2 (en) | Apparatus and method for providing user interface service in a multimedia system | |
CN107168666A (en) | The system and method for USB interface-based audio video transmission and multi-screen mapping | |
CN104081374B (en) | Classification display device server system and method | |
CN103516882B (en) | A kind of based on multi-screen interaction scene picture playing method and system | |
CN105072507B (en) | A kind of transmission method and system of multi-medium data | |
KR20170059474A (en) | Presentation of computing environment on multiple devices | |
CN203352696U (en) | Multimedia digit conference system | |
CN114339332B (en) | Mobile terminal, display device and cross-network screen projection method | |
CN111970546A (en) | Method and device for controlling terminal interaction, electronic equipment and storage medium | |
CN103338346A (en) | Method and system for realizing multimedia digital conference | |
CN114286152A (en) | Display device, communication terminal and screen projection picture dynamic display method | |
WO2023011058A1 (en) | Display device, communication terminal, and projected-screen image dynamic display method | |
CN101399905A (en) | Interactive set-top box | |
WO2023284498A1 (en) | Video playing method and apparatus, and storage medium | |
CN114630101B (en) | Display device, VR device and display control method of virtual reality application content | |
CN116264619A (en) | Resource processing method, device, server, terminal, system and storage medium | |
CN201127036Y (en) | Apparatus for implementing built-in equipment resource share based on p2p technique |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||