CN111741341A - Shared screen processing apparatus and shared screen processing method - Google Patents

Shared screen processing apparatus and shared screen processing method

Info

Publication number
CN111741341A
CN111741341A (application CN202010524355.6A)
Authority
CN
China
Prior art keywords
screen
shared
special effect
user interface
interface layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010524355.6A
Other languages
Chinese (zh)
Inventor
Wen Wen (文闻)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics China R&D Center and Samsung Electronics Co Ltd
Priority: CN202010524355.6A
Publication: CN111741341A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A shared screen processing apparatus and a shared screen processing method are disclosed. The shared screen processing method includes: extracting a user interface layer corresponding to a screen to be shared, and performing special effect processing on the user interface layer; synthesizing the user interface layer on which the special effect processing is performed with the screen to be shared to generate a shared screen; and performing mobile screen projection on the generated shared screen.

Description

Shared screen processing apparatus and shared screen processing method
Technical Field
The present disclosure relates to sharing screens between electronic terminals, and more particularly, to a shared screen processing apparatus and a shared screen processing method capable of adding a special effect.
Background
Sharing a screen (i.e., projecting a screen) refers to synchronizing an image displayed and/or a video played on one electronic terminal to another electronic terminal for display and playback. Typically, the screen of a mobile electronic terminal may be projected onto a computer and/or a television for display on a larger screen.
Disclosure of Invention
An aspect of the present disclosure is to provide a shared screen processing apparatus and a shared screen processing method that apply a special effect to shared screen contents, thereby improving privacy and visibility, improving operability, and improving user experience.
In one general aspect, there is provided a shared screen processing method including: extracting a user interface layer corresponding to a screen to be shared, and performing special effect processing on the user interface layer; synthesizing the user interface layer on which the special effect processing is performed with the screen to be shared to generate a shared screen; and performing mobile screen projection on the generated shared screen.
Optionally, the step of extracting a user interface layer corresponding to the screen to be shared, and performing special effect processing on the user interface layer, includes: displaying the extracted user interface layer and providing a special effects menu including at least one special effect.
Optionally, the step of extracting a user interface layer corresponding to the screen to be shared, and performing special effect processing on the user interface layer further includes: performing a special effect process on the extracted user interface layer in response to a user selection of a special effect included in the special effect menu and an operation performed using the selected special effect.
Optionally, the special effects included in the special effects menu include mosaic, graffiti, sticker, cropping, zooming, text, and animation.
Optionally, the special effect processing performed on the extracted user interface layer includes: adding a mosaic, graffiti, text, and/or animation to the extracted user interface layer, cropping the extracted user interface layer, and/or scaling all or part of the extracted user interface layer.
Optionally, the step of synthesizing the user interface layer on which the special effect processing is performed with the screen to be shared includes: identifying the content displayed on the screen to be shared, and synthesizing the user interface layer on which the special effect processing is performed with the screen to be shared based on the identification result.
Optionally, the step of synthesizing the user interface layer on which the special effect processing is performed with the screen to be shared includes: when the identification result indicates that the screen to be shared displays picture content, synthesizing the portion of the user interface layer on which the special effect processing is performed into the picture content; and when the identification result indicates that the screen to be shared displays video content, synthesizing the portion of the user interface layer on which the special effect processing is performed into the video content, and cancelling the special effect processing when the portion on which it is performed disappears from the video content.
Optionally, the step of synthesizing the user interface layer on which the special effect processing is performed with the screen to be shared further includes: when a portion on which the special effect processing is performed appears again in the video content after disappearing from the video content, the special effect processing performed on the portion is resumed.
Optionally, the step of performing mobile screen projection on the generated shared screen includes: performing mobile screen projection on the generated shared screen to share it to a mobile electronic terminal or a stationary electronic terminal.
In another general aspect, there is provided a shared screen processing apparatus including: an editing module configured to extract a user interface layer corresponding to a screen to be shared and perform special effect processing on the user interface layer; a composition module configured to synthesize the user interface layer on which the special effect processing is performed with the screen to be shared to generate a shared screen; and a transmitting module configured to perform mobile screen projection on the generated shared screen.
Optionally, the editing module is further configured to display the extracted user interface layer and provide a special effects menu including at least one special effect.
Optionally, the editing module is further configured to perform a special effect process on the extracted user interface layer in response to a user selection of a special effect included in the special effect menu and an operation performed using the selected special effect.
Optionally, the special effects included in the special effects menu include mosaic, graffiti, sticker, cropping, zooming, text, and animation.
Optionally, the special effect processing performed on the extracted user interface layer includes: adding a mosaic, graffiti, text, and/or animation to the extracted user interface layer, cropping the extracted user interface layer, and/or scaling all or part of the extracted user interface layer.
Optionally, the composition module is configured to: identify the content displayed on the screen to be shared, and synthesize the user interface layer on which the special effect processing is performed with the screen to be shared based on the identification result.
Optionally, the composition module is configured to: when the identification result indicates that the screen to be shared displays picture content, synthesize the portion of the user interface layer on which the special effect processing is performed into the picture content; and when the identification result indicates that the screen to be shared displays video content, synthesize the portion of the user interface layer on which the special effect processing is performed into the video content, and cancel the special effect processing when the portion on which it is performed disappears from the video content.
Optionally, the synthesis module is further configured to: when a portion on which the special effect processing is performed appears again in the video content after disappearing from the video content, the special effect processing performed on the portion is resumed.
Optionally, the transmitting module is configured to perform mobile screen projection on the generated shared screen to share it to a mobile electronic terminal or a stationary electronic terminal.
In another general aspect, there is provided a computer readable storage medium storing a computer program which, when executed by a processor, implements the shared screen processing method as described above.
In another general aspect, there is provided an electronic terminal, including: a processor; and a memory storing a computer program which, when executed by the processor, implements the shared screen processing method as described above.
In the shared screen processing apparatus and the shared screen processing method according to the exemplary embodiments of the present disclosure, by applying special effects to the shared screen content, content that is not intended to be shared can be effectively blocked, thereby improving privacy. In addition, by cropping and/or scaling the screen content, the visibility of the screen content can be improved. Furthermore, by inserting graffiti, stickers, text, animations, and the like, the operability of the screen content can be improved and the user experience enhanced.
Additional aspects and/or advantages of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
Drawings
The above and other objects and features of exemplary embodiments of the present disclosure will become more apparent from the following description taken in conjunction with the accompanying drawings which illustrate exemplary embodiments, wherein:
fig. 1 is a block diagram illustrating a configuration of a shared screen processing apparatus according to an exemplary embodiment of the present disclosure;
fig. 2 is a flowchart illustrating an example of a shared screen processing method according to an exemplary embodiment of the present disclosure;
fig. 3 is a diagram illustrating one example of a sharing screen according to an exemplary embodiment of the present disclosure;
fig. 4 is a diagram illustrating another example of a sharing screen according to an exemplary embodiment of the present disclosure.
Detailed Description
The following detailed description is provided to assist the reader in obtaining a thorough understanding of the methods, devices, and/or systems described herein. However, various changes, modifications, and equivalents of the methods, apparatus, and/or systems described herein will be apparent to those skilled in the art after reviewing the disclosure of the present application. For example, the order of operations described herein is merely an example, and is not limited to those set forth herein, but may be changed as will become apparent after understanding the disclosure of the present application, except to the extent that operations must occur in a particular order. Moreover, descriptions of features known in the art may be omitted for clarity and conciseness.
The features described herein may be embodied in different forms and should not be construed as limited to the examples described herein. Rather, the examples described herein have been provided to illustrate only some of the many possible ways to implement the methods, devices, and/or systems described herein, which will be apparent after understanding the disclosure of the present application.
As used herein, the term "and/or" includes any one of the associated listed items and any combination of any two or more.
Although terms such as "first", "second", and "third" may be used herein to describe various elements, components, regions, layers or sections, these elements, components, regions, layers or sections should not be limited by these terms. Rather, these terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section referred to in the examples described herein could also be referred to as a second element, component, region, layer or section without departing from the teachings of the examples.
In the specification, when an element (such as a layer, region or substrate) is described as being "on," "connected to" or "coupled to" another element, it can be directly on, connected to or coupled to the other element or one or more other elements may be present therebetween. In contrast, when an element is referred to as being "directly on," "directly connected to," or "directly coupled to" another element, there may be no intervening elements present.
The terminology used herein is for the purpose of describing various examples only and is not intended to be limiting of the disclosure. The singular is also intended to include the plural unless the context clearly indicates otherwise. The terms "comprises," "comprising," and "having" specify the presence of stated features, quantities, operations, elements, components, and/or combinations thereof, but do not preclude the presence or addition of one or more other features, quantities, operations, components, elements, and/or combinations thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs after understanding the present disclosure. Unless explicitly defined as such herein, terms (such as those defined in general dictionaries) should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure, and should not be interpreted in an idealized or overly formal sense.
Further, in the description of the examples, when it is considered that detailed description of well-known related structures or functions will cause a vague explanation of the present disclosure, such detailed description will be omitted.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. Embodiments, however, may be embodied in various forms and are not limited to the examples described herein.
Fig. 1 is a block diagram illustrating a configuration of a shared screen processing apparatus according to an exemplary embodiment of the present disclosure. The shared screen processing apparatus as shown in fig. 1 may be implemented in various electronic terminals. Various electronic terminals include, but are not limited to, mobile phones, smart devices, tablet devices, televisions, wearable devices (such as smart watches), and the like.
Referring to fig. 1, a shared screen processing apparatus 100 according to an exemplary embodiment of the present disclosure includes an editing module 110, a composition module 120, and a transmission module 130. Here, the shared screen processing apparatus 100 may be implemented in a Central Processing Unit (CPU), a Graphic Processing Unit (GPU), or an Application Processor (AP) in the electronic terminal, but is not limited thereto.
The editing module 110 may extract a User Interface (UI) layer corresponding to a screen to be shared and perform a special effect process on the UI layer. Specifically, the editing module 110 may display the extracted UI layer and provide a special effects menu including at least one special effect on the extracted UI layer. The effects included in the effects menu may include, but are not limited to, mosaic, graffiti, stickers, cuts, zooms, text, animations, and the like. When a user of the electronic terminal selects a special effect included in the special effect menu and performs an operation on the extracted UI layer using the selected special effect (e.g., without limitation, adding a mosaic, a sticker, text, an animation, etc. on the extracted UI layer, cropping the extracted UI layer, etc., and/or performing scaling on all or part of the extracted UI layer, etc.), the editing module 110 may perform a special effect process on the extracted UI layer (i.e., adding a mosaic, a sticker, text, an animation, cropping, performing scaling, etc., as described above) in response to the user's selection of the special effect included in the special effect menu and the operation performed using the selected special effect.
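The mosaic special effect described above can be illustrated with a minimal sketch. The following example (function names and the 2D-list data layout are hypothetical, not taken from the disclosure) pixelates a rectangular region of a grayscale layer by replacing each block of pixels with its average value:

```python
# Illustrative sketch of a mosaic effect on a UI layer, represented here
# as a 2D list of grayscale values. All names are hypothetical.

def apply_mosaic(layer, top, left, height, width, block=2):
    """Return a copy of `layer` with the given region pixelated."""
    out = [row[:] for row in layer]
    for by in range(top, top + height, block):
        for bx in range(left, left + width, block):
            # Collect the pixels of this block, clipped to the region.
            ys = range(by, min(by + block, top + height))
            xs = range(bx, min(bx + block, left + width))
            vals = [layer[y][x] for y in ys for x in xs]
            avg = sum(vals) // len(vals)
            for y in ys:
                for x in xs:
                    out[y][x] = avg
    return out

# A 4x4 "UI layer" with distinct pixel values.
ui_layer = [[(y * 4 + x) * 10 for x in range(4)] for y in range(4)]
# Pixelate the top-left 2x2 region; pixels outside it are untouched.
masked = apply_mosaic(ui_layer, 0, 0, 2, 2, block=2)
```

A real editing module would operate on the actual layer bitmap and offer graffiti, stickers, text, and animation in the same menu-driven way; only the block-averaging idea is shown here.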
The composition module 120 may combine the UI layer on which the special effect processing is performed with the screen to be shared to generate a shared screen. In the operation of generating the sharing screen, the composition module 120 may identify the content displayed by the screen to be shared, and then compose the user interface layer on which the special effect process is performed with the screen to be shared based on the identification result. Specifically, when the recognition result indicates that the screen to be shared displays picture content, the composition module 120 may combine the portion of the UI layer on which the special effect process is performed to the picture content. When the recognition result indicates that the screen to be shared displays video content, the composition module 120 may compose a portion of the user interface layer on which the special effect process is performed to the video content, and cancel the performed special effect process as the portion on which the special effect process is performed disappears from the video content. For example, when a mosaic special effect is added to a specific object appearing in video content, the added mosaic special effect is also cancelled as the specific object disappears from the video content (e.g., the scene of the video content changes so that the specific object no longer appears in the video scene). Further, when a portion on which the special effect processing is performed reappears in the video content after disappearing from the video content, the composition module 120 may restore the special effect processing performed on the portion. For example, when a specific object to which a mosaic special effect is added reappears in the video content as time passes after disappearing from the video content, the mosaic special effect will be restored to the specific object.
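The cancel-and-restore behavior described above can be sketched as a per-frame check of whether the target object is still detected in the video content. In the following illustrative example (the class and method names are hypothetical), an effect attached to an object id is rendered only in frames where that object is present, so it is cancelled when the object disappears and restored when it reappears:

```python
# Illustrative sketch of effect persistence during video composition.
# Object detection itself is assumed to happen elsewhere; this class only
# decides which attached effects to render for the current frame.

class EffectCompositor:
    def __init__(self):
        # Maps a target object id to the special effect attached to it.
        self.effects = {}

    def attach(self, object_id, effect):
        self.effects[object_id] = effect

    def compose(self, frame_objects):
        """Given the set of object ids detected in the current frame,
        return the (object_id, effect) pairs that should be rendered."""
        return [(oid, fx) for oid, fx in self.effects.items()
                if oid in frame_objects]

comp = EffectCompositor()
comp.attach("face_1", "mosaic")
visible = comp.compose({"face_1", "car_2"})   # object present: effect on
hidden = comp.compose({"car_2"})              # object gone: effect cancelled
restored = comp.compose({"face_1"})           # object back: effect restored
```

Note that the attachment is never discarded while the object is absent; this is what allows the effect to be restored automatically when the object reappears in the scene.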
The transmitting module 130 may perform mobile screen projection on the generated shared screen. In particular, the transmitting module 130 may perform mobile screen projection on the generated sharing screen to share the generated sharing screen to the mobile electronic terminal or the stationary electronic terminal. For example, a mobile electronic terminal may include a mobile phone, a smart phone, a tablet device, etc., and a stationary electronic terminal may include a set-top box, a television, a projector, etc.
As described above, in the shared screen processing apparatus 100 according to the exemplary embodiment of the present disclosure, by applying special effects to the shared screen content, content that is not intended to be shared can be effectively blocked, thereby improving privacy. Further, by cropping and/or scaling the screen content, the visibility of the screen content can be improved. In addition, by inserting graffiti, stickers, text, animations, and the like, the operability of the screen content can be improved and the user experience enhanced.
Fig. 2 is a flowchart illustrating an example of a shared screen processing method according to an exemplary embodiment of the present disclosure.
Referring to fig. 2, according to the shared screen processing method, in step S201, a User Interface (UI) layer corresponding to a screen to be shared is extracted, and special effect processing is performed on the UI layer. Specifically, in step S201, the extracted UI layer may be displayed and a special effects menu including at least one special effect may be provided. Further, special effect processing may be performed on the extracted UI layer in response to a user's selection of a special effect included in the special effects menu and an operation performed using the selected special effect. As described above, the special effects included in the special effects menu may include, but are not limited to, mosaic, graffiti, sticker, cropping, zooming, text, animation, and the like. The special effect processing performed on the extracted UI layer may include, but is not limited to: adding a mosaic, graffiti, text, and/or animation to the extracted UI layer; cropping the extracted UI layer; and scaling all or part of the extracted UI layer.
In step S202, the UI layer on which the special effect processing is performed may be synthesized with the screen to be shared to generate a shared screen. Specifically, in step S202, the content displayed on the screen to be shared may be identified, and the UI layer on which the special effect processing is performed may be synthesized with the screen to be shared based on the identification result. When the identification result indicates that the screen to be shared displays picture content, the portion of the UI layer on which the special effect processing is performed is synthesized into the picture content. When the identification result indicates that the screen to be shared displays video content, the portion of the UI layer on which the special effect processing is performed is synthesized into the video content, and the special effect processing is cancelled when the portion on which it is performed disappears from the video content. Further, when a portion on which the special effect processing is performed reappears in the video content after disappearing from it, the special effect processing performed on that portion is restored.
In step S203, a mobile screen projection may be performed on the generated shared screen. In particular, mobile screen projection may be performed on the generated sharing screen to share the generated sharing screen to the mobile electronic terminal or the stationary electronic terminal.
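Steps S201 to S203 can be sketched end to end as a small pipeline. In the following illustrative example, all names are hypothetical stand-ins: a real system would obtain the UI layer from the window compositor and send frames over a casting protocol, whereas here the screen is a plain dictionary and the projection sender is a stub:

```python
# Illustrative end-to-end sketch of steps S201 (extract + effect),
# S202 (composite), and S203 (project). All names are hypothetical.

def extract_ui_layer(screen):
    # Stand-in for extraction from the window compositor: copy the UI part.
    return dict(screen["ui"])

def apply_effect(layer, effect):
    # Record an effect on a copy of the layer, leaving the original intact.
    layer = dict(layer)
    layer["effects"] = layer.get("effects", []) + [effect]
    return layer

def composite(layer, screen):
    # S202: combine the processed UI layer with the screen content.
    return {"content": screen["content"], "ui": layer}

def project(shared_screen, target):
    # S203: stub for mobile screen projection (e.g. over wireless LAN).
    return {"target": target, "screen": shared_screen}

screen = {"content": "photo.jpg", "ui": {"widgets": ["menu"]}}
layer = apply_effect(extract_ui_layer(screen), "mosaic")   # S201
shared = composite(layer, screen)                          # S202
sent = project(shared, "living-room-tv")                   # S203
```

Keeping the original screen untouched and compositing a processed copy mirrors the method's structure: the effect lives on the extracted layer, and only the synthesized result is projected.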
As described above, in the shared screen processing method according to the exemplary embodiment of the present disclosure, by applying special effects to the shared screen content, content that is not intended to be shared can be effectively blocked, thereby improving privacy. Further, by cropping and/or scaling the screen content, the visibility of the screen content can be improved. In addition, by inserting graffiti, stickers, text, animations, and the like, the operability of the screen content can be improved and the user experience enhanced.
Fig. 3 is a diagram illustrating one example of a sharing screen according to an exemplary embodiment of the present disclosure.
The left side of fig. 3 shows the extracted UI layer. A special effects menu including a plurality of special effects is provided below the UI layer. A mosaic effect and a graffiti effect have been added to the extracted UI layer. The right side of fig. 3 shows the shared screen generated by synthesizing the UI layer on which the special effect processing is performed with the screen to be shared, as projected to another electronic terminal.
Fig. 4 is a diagram illustrating another example of a sharing screen according to an exemplary embodiment of the present disclosure.
The left side of fig. 4 shows the extracted UI layer. A special effects menu including a plurality of special effects is provided below the UI layer. The extracted UI layer is cropped, and the cropped portion is enlarged. The right side of fig. 4 shows the shared screen generated by synthesizing the UI layer on which the special effect processing is performed with the screen to be shared, as projected to another electronic terminal. Only the cropped portion of the extracted UI layer is enlarged and synthesized with the screen to be shared to generate the shared screen projected to another electronic terminal. In other words, only a portion of the screen to be shared is enlarged and projected to the other electronic terminal.
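The crop-and-enlarge effect of fig. 4 can be sketched as two operations on a 2D layer: cutting out a region, then scaling it up by integer replication (nearest-neighbour). The following example is illustrative only; the function names are hypothetical and real systems would use bitmap scaling with interpolation:

```python
# Illustrative sketch of cropping a region out of a 2D layer and enlarging
# it by pixel replication. All names are hypothetical.

def crop(layer, top, left, height, width):
    """Cut a height x width region out of the layer."""
    return [row[left:left + width] for row in layer[top:top + height]]

def enlarge(layer, factor):
    """Nearest-neighbour upscale: repeat each pixel `factor` times in
    both directions."""
    out = []
    for row in layer:
        wide = [v for v in row for _ in range(factor)]
        out.extend([wide[:] for _ in range(factor)])
    return out

layer = [[1, 2], [3, 4]]
part = crop(layer, 0, 0, 1, 2)   # keep only the top row
big = enlarge(part, 2)           # double it in both dimensions
```

Only `big` would then be synthesized with the screen to be shared, so the remote terminal sees just the enlarged portion.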
It should be understood that the respective units/modules in the shared screen processing apparatus according to the exemplary embodiments of the present disclosure may be implemented as hardware components and/or software components. The respective units/modules may be implemented, for example, using a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC), according to the processing performed by each unit/module.
The shared screen processing method according to the exemplary embodiments of the present disclosure may be written as a computer program, code segments, instructions, or any combination thereof, and recorded, stored, or fixed in or on one or more non-transitory computer-readable storage media. A computer-readable storage medium is any data storage device that can store data that can be read by a computer system. Examples of computer-readable storage media include: read-only memory, random-access memory, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and carrier waves (such as data transmission through the Internet via wired or wireless transmission paths).
An electronic terminal according to an exemplary embodiment of the present disclosure includes: a processor (not shown) and a memory (not shown), wherein the memory stores a computer program which, when executed by the processor, implements the shared screen processing method as in the above-described exemplary embodiments.
Although a few exemplary embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the disclosure, the scope of which is defined in the claims and their equivalents.

Claims (10)

1. A shared screen processing method, characterized in that the shared screen processing method comprises:
extracting a user interface layer corresponding to a screen to be shared, and executing special effect processing on the user interface layer;
synthesizing the user interface layer which is subjected to the special effect processing with a screen to be shared to generate a shared screen; and
performing mobile screen projection on the generated shared screen.
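The three claimed steps (extract the UI layer, perform special effect processing on it, synthesize it with the screen to be shared) can be sketched in miniature as below. Everything here is an illustrative assumption: the 2-D grids, the rule that non-zero pixels belong to the UI overlay, the "+100" stand-in effect, and the function names are invented for the example, and the mobile screen projection step is omitted.

```python
def extract_ui_layer(screen):
    # Illustrative stand-in for UI-layer extraction: treat zero pixels as
    # background and non-zero pixels as belonging to the UI overlay.
    return [[px if px != 0 else None for px in row] for row in screen]

def synthesize(screen, ui_layer):
    # Opaque (non-None) UI pixels replace the underlying screen pixels;
    # transparent (None) pixels let the screen to be shared show through.
    return [[ui if ui is not None else bg
             for bg, ui in zip(srow, urow)]
            for srow, urow in zip(screen, ui_layer)]

screen = [[1, 0], [0, 4]]
ui = extract_ui_layer(screen)
# Stand-in "special effect processing": brighten every opaque UI pixel.
ui = [[px + 100 if px is not None else None for px in row] for row in ui]
shared = synthesize(screen, ui)   # the sharing screen to be projected
```

The effect is applied only to the extracted layer, so the background of the screen to be shared passes through the synthesis unchanged.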
2. The shared-screen processing method of claim 1, wherein the step of extracting a user interface layer corresponding to the screen to be shared, and performing a special effect process on the user interface layer comprises: displaying the extracted user interface layer and providing a special effects menu including at least one special effect.
3. The shared-screen processing method of claim 2, wherein the step of extracting a user interface layer corresponding to the screen to be shared, and performing a special effect process on the user interface layer further comprises: performing a special effect process on the extracted user interface layer in response to a user selection of a special effect included in the special effect menu and an operation performed using the selected special effect.
4. The shared screen processing method of claim 3, wherein the special effects included in the special effects menu include mosaic, graffiti, sticker, cropping, zooming, text, and animation.
5. The shared screen processing method of claim 4, wherein the special effect processing performed on the extracted user interface layer comprises: adding a mosaic, graffiti, text, and/or animation to the extracted user interface layer, cropping the extracted user interface layer, and/or scaling all or part of the extracted user interface layer.
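Of the effects listed in claims 4 and 5, the mosaic is the most algorithmic, so it is the one sketched below. The block-averaging approach and the single-channel toy pixel grid are assumptions for illustration; the patent does not specify how its mosaic is computed.

```python
def mosaic(layer, block):
    # Replace each block×block tile with the average of its pixels,
    # producing a simple pixelation ("mosaic") effect.
    h, w = len(layer), len(layer[0])
    out = [row[:] for row in layer]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = [layer[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(tile) // len(tile)
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out

blurred = mosaic([[0, 2], [4, 6]], block=2)   # one 2×2 tile, average (0+2+4+6)//4
```

Larger `block` values discard more detail, which is the point of applying a mosaic to sensitive regions before sharing.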
6. The shared screen processing method of claim 1, wherein the step of synthesizing the user interface layer on which the special effect processing is performed with the screen to be shared comprises: identifying the content displayed on the screen to be shared, and synthesizing the user interface layer on which the special effect processing is performed with the screen to be shared based on the identification result.
7. The shared screen processing method of claim 6, wherein the step of synthesizing the user interface layer on which the special effect processing is performed with the screen to be shared comprises:
when the identification result indicates that the screen to be shared displays picture content, synthesizing the portion of the user interface layer on which the special effect processing is performed into the picture content; and
when the identification result indicates that the screen to be shared displays video content, synthesizing the portion of the user interface layer on which the special effect processing is performed into the video content, and canceling the performed special effect processing when the portion on which the special effect processing is performed disappears from the video content.
8. The shared screen processing method of claim 6, wherein the step of synthesizing the user interface layer on which the special effect processing is performed with the screen to be shared further comprises:
when the portion on which the special effect processing is performed reappears in the video content after disappearing from the video content, resuming the special effect processing performed on that portion.
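The cancel-and-resume behaviour of claims 7 and 8 amounts to tying the effect's active state to the visibility of its target region across video frames. The sketch below assumes region detection is done elsewhere and yields simple identifiers per frame; the class name and interface are invented for illustration.

```python
class EffectTracker:
    """Cancel a special effect when its target region disappears from the
    video content, and resume it when the region reappears (sketch)."""

    def __init__(self, target_region):
        self.target_region = target_region
        self.active = False

    def update(self, visible_regions):
        # visible_regions: identifiers detected in the current video frame.
        # The effect is active exactly while its target is on screen.
        self.active = self.target_region in visible_regions
        return self.active

tracker = EffectTracker("logo")
```

Feeding successive frames through `update` applies, cancels, and resumes the effect without any action from the user.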
9. The shared screen processing method of claim 1, wherein the step of performing the mobile screen projection on the generated shared screen comprises: performing mobile screen projection on the generated sharing screen to share the generated sharing screen with a mobile electronic terminal or a fixed electronic terminal.
10. A shared screen processing apparatus, characterized in that the shared screen processing apparatus comprises:
an editing module configured to extract a user interface layer corresponding to a screen to be shared and to perform special effect processing on the user interface layer;
a synthesis module configured to synthesize the user interface layer on which the special effect processing is performed with the screen to be shared to generate a sharing screen; and
a transmitting module configured to perform mobile screen projection on the generated sharing screen.
CN202010524355.6A 2020-06-10 2020-06-10 Shared screen processing apparatus and shared screen processing method Pending CN111741341A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010524355.6A CN111741341A (en) 2020-06-10 2020-06-10 Shared screen processing apparatus and shared screen processing method


Publications (1)

Publication Number Publication Date
CN111741341A true CN111741341A (en) 2020-10-02

Family

ID=72648681



Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113179382A (en) * 2021-03-02 2021-07-27 广州朗国电子科技有限公司 Multi-conference equipment screen sharing method, equipment and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103312804A (en) * 2013-06-17 2013-09-18 华为技术有限公司 Screen sharing method, associated equipment and communication system
CN104679395A (en) * 2013-11-26 2015-06-03 华为技术有限公司 Document presenting method and user terminal
US20160246483A1 (en) * 2015-02-23 2016-08-25 Fuji Xerox Co., Ltd. Display control device, communication terminal, and display control method
WO2017193530A1 (en) * 2016-05-12 2017-11-16 中兴通讯股份有限公司 Image generation method, device, and terminal
CN108124173A (en) * 2017-12-11 2018-06-05 深圳创维-Rgb电子有限公司 A kind of one-to-many throw shields display methods, system and storage medium
CN108924616A (en) * 2018-07-27 2018-11-30 维沃移动通信有限公司 A kind of display control method and terminal
US20190075340A1 (en) * 2017-09-01 2019-03-07 Christophe Michel Pierre Hochart Systems and methods for content delivery
CN110263191A (en) * 2019-06-24 2019-09-20 广州讯立享智能科技有限公司 A kind of the stacking display methods and system of multimedia resource
US20200013373A1 (en) * 2017-03-30 2020-01-09 Optim Corporation Computer system, screen sharing method, and program



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201002