WO2015067207A1 - Method and device for sharing live desktop information - Google Patents

Method and device for sharing live desktop information

Info

Publication number
WO2015067207A1
Authority
WO
WIPO (PCT)
Prior art keywords
live
live area
desktop
drag
area
Application number
PCT/CN2014/090555
Other languages
French (fr)
Inventor
Mimi LU
Junyu CHEN
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Application filed by Tencent Technology (Shenzhen) Company Limited filed Critical Tencent Technology (Shenzhen) Company Limited
Publication of WO2015067207A1 publication Critical patent/WO2015067207A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/56Provisioning of proxy services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4728End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Definitions

  • the present disclosure relates to network technologies, and more particularly, to a method and device for sharing live desktop information.
  • With terminal devices, such as mobile phones and personal computers, people may not only transmit and share data and make video calls, but also live broadcast the desktop activities of a terminal device and share the operations of the terminal device with the other party.
  • the methods for live broadcasting desktop activities include desktop full-screen live broadcast and software process live broadcast.
  • A method and device for sharing live desktop information are provided to protect privacy and improve flexibility.
  • the method for sharing live desktop information includes: determining a live area on the desktop of a first terminal; the live area is one part of the desktop; capturing image data in the live area during the process of desktop live broadcast; and sending, by the first terminal, captured image data to a video server, so that the video server processes the image data, and provides a second terminal with video data obtained after the processing.
  • the device for sharing live desktop information includes: a determining module, configured to determine a live area on the desktop of a first terminal; the live area is one part of the desktop; a capturing module, configured to capture image data in the live area during the process of desktop live broadcast; and a sending module, configured to send captured image data to a video server, so that the video server processes the image data, and provides a second terminal with video data obtained after the processing.
  • the first terminal may determine the live area on the desktop of the first terminal, only capture image data in the live area during the process of desktop live broadcast, and send captured image data to the video server, so that the video server may process the image data, and provide video data obtained after the processing for the second terminal.
  • the user of the second terminal is only able to watch information in the live area on the desktop of the first terminal, and is not able to watch information outside the live area on the desktop, thus the privacy may be protected.
  • The area of the first terminal's desktop that the user of the second terminal may watch is determined by the live area, and has nothing to do with which software windows are activated on the desktop.
  • Multiple software interfaces in the live area can be broadcasted live during one live broadcast.
  • Switching among software applications may also be live broadcast, so that the operation interfaces of different software may be shown at the same time. Therefore, flexibility is improved, and the application scope is broadened.
  • Figure 1 is a flow diagram illustrating a method for sharing live desktop information according to an example of the present disclosure.
  • Figure 2 is a flow diagram illustrating a method for sharing live desktop information according to another example of the present disclosure.
  • Figure 3 is a schematic diagram illustrating a live interface according to an example of the present disclosure.
  • Figure 4 is a flow diagram illustrating interactive operations among a first terminal, a video server and a second terminal according to an example of the present disclosure.
  • Figure 5 is a schematic diagram illustrating a structure of a device for sharing live desktop information according to another example of the present disclosure.
  • Figure 6 is a schematic diagram illustrating another structure of the device for sharing live desktop information according to an example of the present disclosure.
  • Figure 1 is a flow diagram illustrating a method for sharing live desktop information according to an example of the present disclosure. As shown in figure 1, the method may include the following operations.
  • a live area is determined on the desktop of the first terminal.
  • the live area is one part of the desktop.
  • image data in the live area is captured during the process of desktop live broadcast.
  • the first terminal sends captured image data to a video server, so that the video server may process the image data, and provide a second terminal with video data obtained after the processing.
  • the first terminal may determine the live area on the desktop of the first terminal, only capture image data in the live area during the process of desktop live broadcast, and send captured image data to the video server, so that the video server may process the image data, and provide video data obtained after the processing for the second terminal.
  • the user of the second terminal is only able to watch information in the live area on the desktop of the first terminal, and is not able to watch information outside the live area on the desktop, which protects the privacy.
  • The area of the first terminal's desktop that the user of the second terminal may watch is determined by the live area, and has nothing to do with which software windows are activated on the desktop.
  • Multiple software interfaces in the live area may be broadcasted live in one live broadcast.
  • Switching among software applications may also be live broadcast to the other party, so that the operation interfaces of different software may be shown at the same time. Therefore, flexibility is improved, and the application scope is broadened.
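The three operations above (determine a live area, capture only inside it, send the result to the video server) can be sketched as a minimal capture loop. This is an illustrative sketch, not the implementation of the disclosure: the frame model (a row-major list of pixel rows) and all names are assumptions, and the transport to the video server is stubbed out as a callback.

```python
def broadcast_live_area(frames, sel_rect, send_to_server):
    """Crop each desktop frame to the live area and hand it to the server.

    Only pixels inside sel_rect = (left, top, right, bottom) ever leave
    the first terminal, so content outside the live area stays private.
    """
    left, top, right, bottom = sel_rect
    for frame in frames:
        cropped = [row[left:right] for row in frame[top:bottom]]
        send_to_server(cropped)

# Demo: two 8x6 "desktop" frames, live area spanning columns 2..4, rows 1..3.
sent = []
frames = [[[0] * 8 for _ in range(6)], [[1] * 8 for _ in range(6)]]
broadcast_live_area(frames, (2, 1, 5, 4), sent.append)
print(len(sent), len(sent[0]), len(sent[0][0]))  # 2 3 3
```

A real implementation would replace `send_to_server` with the network path to the video server and the nested lists with a platform capture API.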
  • Determining the live area on the desktop of the first terminal may include the following.
  • the start position of the drag-and-drop operation on the desktop is taken as a first position
  • the final position of the drag-and-drop operation on the desktop is taken as a second position
  • a rectangular area of which the diagonal vertexes are the first position and the second position is obtained; and the rectangular area of which the diagonal vertexes are the first position and the second position is determined as the live area.
  • The method may further include the following.
  • Adjusting the position of the live area may include the following.
  • At every first preset time interval, the position of the live area is adjusted to the current position of the live area border on the desktop; or, each time the distance by which the live area border is dragged is larger than or equal to a first preset threshold, the position of the live area is adjusted to the current position of the live area border on the desktop.
  • The method may further include the following.
  • Adjusting the size of the live area may include the following.
  • At every second preset time interval, the size of the live area is adjusted to the current size of the live area border on the desktop; or, each time the distance by which any one side or any one vertex of the live area border is dragged is larger than or equal to a second preset threshold, the size of the live area is adjusted to the current size of the live area border on the desktop.
  • Figure 2 is a flow diagram illustrating a method for sharing live desktop information according to another example of the present disclosure.
  • The method may be executed by the first terminal.
  • the method may include the following operations.
  • a live area is determined on the desktop of the first terminal.
  • the live area is one part of the desktop.
  • the live area may be determined on the desktop of the first terminal by a user.
  • When desktop information is live broadcast, the first terminal only live broadcasts the display interface in the live area on the desktop to the second terminal, and does not live broadcast the display interface outside of the live area on the desktop.
  • Thus the second terminal may only watch desktop information in the live area, and cannot watch desktop information outside of the live area on the desktop of the first terminal.
  • When determining the live area, the user may exclude from the live area any private content and other information on the desktop that the user does not want the second terminal to watch. Thus the privacy of the user is protected.
  • the second terminal may be a terminal device which is a watching party during the process of desktop live broadcast performed by the first terminal.
  • Block 201 may include the following.
  • When a drag-and-drop operation is detected, the start position of the drag-and-drop operation on the desktop is taken as a first position, and the final position of the drag-and-drop operation on the desktop is taken as a second position; a rectangular area of which the diagonal vertexes are the first position and the second position is obtained, and this rectangular area is determined as the live area.
  • the drag-and-drop operation may be triggered by the user using fingers on the touch screen.
  • the drag-and-drop operation may also be triggered by a control device of the terminal, such as a mouse or a tablet.
  • Figure 3 is a schematic diagram illustrating a live area on a live interface according to an example of the present disclosure.
  • the first position 32 in figure 3 is the start position of the drag-and-drop operation
  • the second position 33 in figure 3 is the final position of the drag-and-drop operation, according to the first position 32 and the second position 33
  • the first terminal may determine a rectangular area of which the diagonal vertexes are the first position and the second position as the live area 34.
  • The direction of the drag-and-drop operation may be from upper left to lower right, from lower left to upper right, from upper right to lower left, or from lower right to upper left.
  • the first terminal may, by monitoring a LButtonDown message of the mouse, record a first position (screen coordinates) PtMouseStart on the desktop corresponding to the mouse when the LButtonDown message occurs.
  • the first terminal may monitor a MouseMove event of the mouse, and draw the border of the live area in real time according to the position of the mouse.
  • the first terminal may record a second position (screen coordinates) PtMouseEnd on the desktop corresponding to the mouse.
  • the first terminal may calculate a rectangular area SelRect bounded by the live area border.
  • the rectangular area SelRect is the live area.
  • the first terminal may pass on the live area to a video engine by taking the live area as a parameter.
  • the video engine acquires image data in the live area.
  • the video engine is one part of the first terminal.
  • During the process of the drag-and-drop operation, a rectangular box of which the diagonal vertexes are the first position and the current position of the drag-and-drop operation on the desktop is displayed. That is to say, the size of the rectangular box changes constantly along with the drag-and-drop operation, so the user may have an intuitive visual perception of the size and scope of the live area, and may thereby determine the final live area.
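The selection logic described above can be sketched as follows. Since the drag may run in any direction, the first position (PtMouseStart) and second position (PtMouseEnd) are normalized into a (left, top, right, bottom) rectangle; the function name below is illustrative, not from the disclosure.

```python
def sel_rect_from_drag(pt_start, pt_end):
    """Return (left, top, right, bottom) of the rectangle whose
    diagonal vertexes are the drag start and end positions."""
    (x1, y1), (x2, y2) = pt_start, pt_end
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

# A drag from lower right to upper left yields the same rectangle as
# the reverse drag; the same helper, applied to the current cursor
# position on each MouseMove, gives the rubber-band box drawn live.
print(sel_rect_from_drag((300, 400), (100, 150)))  # (100, 150, 300, 400)
```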
  • image data in the live area is captured during the process of desktop live broadcast.
  • When a live broadcast start operation is detected, block 203 is performed; when a live broadcast stop operation is detected, block 206 is performed.
  • a video engine of the first terminal captures image data in the live area.
  • The image data within the live area may include, but is not limited to, all static and dynamic image data shown in the live area, such as the desktop background, application windows, desktop icons, and the mouse pointer location.
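Restricting capture to the live area can be sketched as cropping each full-desktop frame to SelRect before it is handed to the encoder. The frame model below (a row-major list of pixel rows) is an assumption for illustration; real capture APIs are platform-specific.

```python
def crop_to_live_area(frame, sel_rect):
    """Keep only the pixels inside sel_rect = (left, top, right, bottom)."""
    left, top, right, bottom = sel_rect
    return [row[left:right] for row in frame[top:bottom]]

# Demo: an 8x6 "desktop" where each pixel records its (x, y) coordinate.
desktop = [[(x, y) for x in range(8)] for y in range(6)]
live = crop_to_live_area(desktop, (2, 1, 5, 4))
print(len(live), len(live[0]))  # 3 3
print(live[0][0])               # (2, 1): top-left pixel of the live area
```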
  • the live interface includes but is not limited to “live broadcast start” and “live broadcast stop” options.
  • After the live area is determined, when a click operation on the “live broadcast start” option is detected, the first terminal determines that the live broadcast start operation is detected, and block 203 is performed.
  • When a click operation on the “live broadcast stop” option is detected, the first terminal determines that the live broadcast stop operation is detected, and block 206 is performed.
  • the first terminal sends captured image data to a video server, so that the video server may process the image data, and provide a second terminal with video data obtained after the processing.
  • After the live area is determined, and when the live broadcast start operation is detected, the first terminal sends captured image data to a video server, so that the video server may process the image data and provide the second terminal with video data obtained after the processing.
  • FIG. 4 is a flow diagram illustrating interactive operations among the first terminal, the video server and the second terminal according to an example of the present disclosure.
  • the video server includes a signaling service module and a video service module.
  • the first terminal sends a live broadcast request message to the video server, and after receiving a startup success message returned by the video server, starts the live broadcast.
  • the first terminal sends captured image data in the live area to the video server, and the video server sends the second terminal a message for informing the second terminal to receive live broadcast content, processes received image data, and delivers processed image data to the second terminal.
  • the second terminal may play the image data, so that a holder of the second terminal may watch the image data.
  • When the live broadcast is stopped, the first terminal stops sending image data and sends a live broadcast stop message to the video server, and the video server stops delivering image data to the second terminal.
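The signaling flow of figure 4 can be sketched as a toy message handler. The message names (live_broadcast_request, startup_success, and so on) are invented for illustration; the actual protocol between the terminals and the video server is not specified in this document.

```python
class VideoServer:
    """Toy stand-in for the signaling and video service modules."""

    def __init__(self):
        self.delivering = False
        self.frames_out = []  # frames delivered toward the second terminal

    def handle(self, msg, payload=None):
        if msg == "live_broadcast_request":
            # Accept the broadcast and acknowledge startup to the first terminal.
            self.delivering = True
            return "startup_success"
        if msg == "image_data" and self.delivering:
            # Process received image data and deliver it to the second terminal.
            self.frames_out.append(payload)
            return "delivered"
        if msg == "live_broadcast_stop":
            # Stop delivering image data to the second terminal.
            self.delivering = False
            return "stopped"

server = VideoServer()
print(server.handle("live_broadcast_request"))   # startup_success
server.handle("image_data", payload=b"frame-1")
server.handle("live_broadcast_stop")
print(len(server.frames_out), server.delivering)  # 1 False
```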
  • the position of the live area may be adjusted through the drag-and-drop operation on the live area border.
  • the drag-and-drop operation may be direct drag and drop performed on the live area border, or may be drag and drop performed on a drag-and-drop button on the live interface.
  • the live interface further includes a drag-and-drop button 35.
  • When a drag-and-drop operation on the drag-and-drop button 35 is detected, the first terminal may determine that a drag-and-drop operation on the live area border is detected and, according to the movement path of the drag-and-drop operation on the drag-and-drop button 35 (namely the movement path of the drag-and-drop operation on the live area border), adjust the position of the live area, display the position of the live area during the process of dragging, and perform image data capturing according to the adjusted live area.
  • the first terminal may, by monitoring a LButtonDown message of the mouse, record a first position (screen coordinates) PtMouseStart on the desktop corresponding to the mouse when the drag-and-drop operation on the live area border occurs; by monitoring a MouseMove event of the mouse, draw the border of the live area in real time according to the position of the mouse.
  • The first terminal records a current position PtMouseCur on the desktop corresponding to the mouse in real time and, according to the current position PtMouseCur, the first position PtMouseStart, and the rectangular area SelRect bounded by the live area border, calculates the current live area and simultaneously updates the value of SelRect. Subsequently, the video engine captures image data in the current live area.
  • Since the live area changes constantly during the drag-and-drop operation on the live area border, in order to avoid the poor performance that results from the video engine frequently responding to changes of the live area, there may be the following two methods for achieving block 204. 1) At every first preset time interval, the position of the live area is adjusted to the current position of the live area border on the desktop.
  • The live area determined before a drag-and-drop operation occurs is called a first live area.
  • A timer is started; when the length of the timer reaches the first preset time interval, the area bounded by the current border of the live area on the desktop is determined as a second live area.
  • The first terminal starts to capture image data in the second live area through the video engine, and the timer is restarted; when the length of the timer reaches the first preset time interval again, the area bounded by the current border of the live area on the desktop is determined as a third live area, and so on. 2) Each time the distance by which the live area border is dragged is larger than or equal to a first preset threshold, the position of the live area is adjusted to the current position of the live area border on the desktop.
  • the live area determined before a drag-and-drop operation occurs is called a first live area
  • When a drag-and-drop operation on the border of the first live area is detected, and the distance by which the first live area border is dragged is larger than or equal to the first preset threshold, the area bounded by the current border of the live area on the desktop is determined as a second live area.
  • The first terminal starts to capture image data in the second live area through the video engine.
  • When the distance by which the second live area border is dragged is larger than or equal to the first preset threshold, the area bounded by the current border of the live area on the desktop is determined as a third live area.
  • The first terminal starts to capture image data in the third live area through the video engine, and so on.
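Both throttling methods above (a first preset time interval, or a first preset drag-distance threshold) can be sketched in one small helper, so the video engine is not re-targeted on every MouseMove. Class and method names are illustrative, and the clock is injected so the time-based method is deterministic to demonstrate.

```python
import math

class LiveAreaThrottle:
    """Decide when the video engine should re-target the live area."""

    def __init__(self, interval_s, dist_threshold, now_fn):
        self.interval_s = interval_s          # first preset time interval
        self.dist_threshold = dist_threshold  # first preset distance threshold
        self.now_fn = now_fn                  # injectable clock
        self.last_commit_t = now_fn()
        self.last_commit_pos = None

    def should_commit_by_time(self):
        # Method 1: commit each time the preset time interval elapses.
        if self.now_fn() - self.last_commit_t >= self.interval_s:
            self.last_commit_t = self.now_fn()
            return True
        return False

    def should_commit_by_distance(self, pos):
        # Method 2: commit once the drag distance reaches the threshold.
        if self.last_commit_pos is None:
            self.last_commit_pos = pos
            return False
        dx = pos[0] - self.last_commit_pos[0]
        dy = pos[1] - self.last_commit_pos[1]
        if math.hypot(dx, dy) >= self.dist_threshold:
            self.last_commit_pos = pos
            return True
        return False

t = [0.0]                                     # fake clock for the demo
th = LiveAreaThrottle(0.5, 20, lambda: t[0])
print(th.should_commit_by_time())             # False: interval not elapsed
t[0] = 0.6
print(th.should_commit_by_time())             # True: 0.6 s >= 0.5 s
th.should_commit_by_distance((0, 0))          # records the start position
print(th.should_commit_by_distance((10, 0)))  # False: moved 10 < 20
print(th.should_commit_by_distance((25, 0)))  # True: moved 25 >= 20
```

Each `True` corresponds to determining a new (second, third, ...) live area and re-pointing the video engine at it.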
  • the live interface includes a drag-and-drop button, and when a drag-and-drop operation on the drag-and-drop button is detected, the first terminal determines that a drag-and-drop operation on the border of the first live area is detected.
  • the live interface may further provide a button for selecting the whole live area border, or other manner for selecting the whole live area border, so as to perform a drag-and-drop operation on the live area border. No further descriptions will be given hereinafter.
  • the size of live area may be adjusted through a drag-and-drop operation on any one side or any one vertex of the live area border.
  • When a drag-and-drop operation on any one side or any one vertex of the live area border is detected, the size of the live area is adjusted according to the movement path of the drag-and-drop operation on that side or vertex.
  • The position of the live area is displayed during the process of dragging, and image data capturing is performed according to the adjusted live area.
  • At every second preset time interval, the size of the live area is adjusted to the current size of the live area border on the desktop.
  • Or, each time the distance by which any one side or any one vertex of the live area border is dragged is larger than or equal to a second preset threshold, the size of the live area is adjusted to the current size of the live area border on the desktop.
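Resizing by dragging a vertex can be sketched as moving the dragged corner to the cursor while the opposite corner stays fixed, then re-normalizing the rectangle just as in the selection step. The vertex naming scheme and function name are illustrative assumptions.

```python
def resize_by_vertex(sel_rect, dragged_vertex, new_pos):
    """Move one corner of sel_rect = (left, top, right, bottom) to new_pos.

    dragged_vertex is one of "top-left", "top-right", "bottom-left",
    "bottom-right" (an illustrative naming scheme, not from the patent).
    """
    left, top, right, bottom = sel_rect
    x, y = new_pos
    if "left" in dragged_vertex:
        left = x
    else:
        right = x
    if "top" in dragged_vertex:
        top = y
    else:
        bottom = y
    # Re-normalize in case the vertex was dragged past the opposite side.
    return (min(left, right), min(top, bottom), max(left, right), max(top, bottom))

# Dragging the bottom-right corner outward enlarges the live area.
print(resize_by_vertex((100, 100, 300, 200), "bottom-right", (350, 260)))
# (100, 100, 350, 260)
```

Dragging a side rather than a vertex would update only one of the four coordinates; the same throttling as in block 204 can gate how often the video engine picks up the new size.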
  • the process for adjusting the size of the live area in block 205 may be similar to the process for adjusting the position of the live area in block 204, and no further descriptions will be given hereinafter.
  • A “live area enlarge” button and a “live area reduce” button may further be provided; when the “live area enlarge” button is clicked, the live area is enlarged according to a preset magnification factor.
  • When the “live area reduce” button is clicked, the live area is reduced according to a preset minification factor.
  • The adjustment of the size of the live area may also be triggered in other manners, and the examples of the present disclosure do not limit how the adjustment of the live area is triggered.
  • When the live broadcast stop option on the live interface (as shown in figure 3) is selected, the live broadcast of image data in the live area is stopped, and the live interface is closed.
  • the first terminal may determine the live area on the desktop of the first terminal, only capture image data in the live area during the process of desktop live broadcast, and send captured image data to the video server, so that the video server may process the image data, and provide video data obtained after the processing for the second terminal.
  • the user of the second terminal is only able to watch information in the live area on the desktop of the first terminal, and is not able to watch information outside the live area on the desktop, which protects the privacy.
  • The area of the first terminal's desktop that the user of the second terminal may watch is determined by the live area, and has nothing to do with which software windows are activated on the desktop.
  • Multiple software interfaces in the live area may be broadcasted live in one live broadcast.
  • Switching among software applications may also be live broadcast to the other party, so that the operation interfaces of different software may be shown at the same time. Therefore, flexibility is improved, and the application scope is broadened.
  • Figure 5 is a schematic diagram illustrating a structure of a device for sharing live desktop information according to another example of the present disclosure.
  • the device may include a determining module 501, a capturing module 502 and a sending module 503.
  • the determining module 501 is configured to determine a live area on the desktop of a first terminal.
  • the live area is one part of the desktop.
  • the capturing module 502 is configured to capture image data in the live area during the process of desktop live broadcast.
  • the sending module 503 is configured to send captured image data to a video server, so that the video server may process the image data, and provide a second terminal with video data obtained after the processing.
  • the determining module 501 may include an obtaining unit and a determining unit.
  • The obtaining unit is configured to, when a drag-and-drop operation is detected, take the start position of the drag-and-drop operation on the desktop as a first position, take the final position of the drag-and-drop operation on the desktop as a second position, and obtain a rectangular area of which the diagonal vertexes are the first position and the second position.
  • the determining unit is configured to determine the rectangular area of which the diagonal vertexes are the first position and the second position as the live area.
  • the device may further include a displaying module, configured to, during the process of the drag-and-drop operation, according to the movement path of the drag-and-drop operation, display a rectangular box of which the diagonal vertexes are the first position and the current position of the drag-and-drop operation on the desktop.
  • the device may further include a first adjusting module, configured to, when a drag-and-drop operation on the live area border is detected, according to the movement path of the drag-and-drop operation on the live area border, adjust the position of the live area, and perform image data capturing according to adjusted live area.
  • the first adjusting module may include a first adjusting unit or a second adjusting unit.
  • The first adjusting unit is configured to, at every first preset time interval, adjust the position of the live area to the current position of the live area border on the desktop.
  • The second adjusting unit is configured to, each time the distance by which the live area border is dragged is larger than or equal to a first preset threshold, adjust the position of the live area to the current position of the live area border on the desktop.
  • the device may further include a second adjusting module, configured to, when a drag-and-drop operation on any one side or any one vertex of the live area border is detected, according to the movement path of the drag-and-drop operation on the side or vertex of the live area border, adjust the size of the live area, and perform image data capturing according to adjusted live area.
  • the second adjusting module may include a third adjusting unit or a fourth adjusting unit.
  • The third adjusting unit is configured to, at every second preset time interval, adjust the size of the live area to the current size of the live area border on the desktop.
  • The fourth adjusting unit is configured to, each time the distance by which any one side or any one vertex of the live area border is dragged is larger than or equal to a second preset threshold, adjust the size of the live area to the current size of the live area border on the desktop.
  • the modules and units included in the device are divided according to function logic, and not limited to the above division, as long as the corresponding functions may be achieved.
  • The modules and units in the device may be distributed in the device as described in the examples, or may be correspondingly changed to be located in one or more devices different from those in the examples.
  • the modules and units in above examples may be merged into one module, or may be divided into multiple sub-modules furthermore.
  • names of the units are only used to easily distinguish from each other, and not intended to limit the scope of the present disclosure.
  • the above examples may be implemented by hardware, software, firmware, or a combination thereof.
  • the various methods, processes and functional modules described herein may be implemented by a processor (the term processor is to be interpreted broadly to include a CPU, processing unit/module, ASIC, logic module, or programmable gate array, etc.).
  • the processes, methods and functional modules may all be performed by a single processor or split between several processors; reference in this disclosure or the claims to a ‘processor’ should thus be interpreted to mean ‘one or more processors’.
  • the processes, methods and functional modules are implemented as machine readable instructions executable by one or more processors, hardware logic circuitry of the one or more processors or a combination thereof.
  • the modules may be combined into one module or further divided into a plurality of sub-modules.
  • the examples disclosed herein may be implemented in the form of a software product.
  • the computer software product is stored in a non-transitory storage medium and comprises a plurality of instructions for making an electronic device implement the method recited in the examples of the present disclosure.
  • the non-transitory storage medium includes a hard disk, a floppy disk, a magnetic disk, a compact disk (e.g., CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-RAM, DVD-RW and DVD+RW), a tape, a Flash card, ROM, and so on.
  • figure 6 is a schematic diagram illustrating another structure of the device for sharing live desktop information according to an example of the present disclosure.
  • the device may include a memory 61 and a processor 62 in communication with the memory 61.
  • the memory 61 may store a group of instructions which may be executed by the processor 62 to implement the operations of modules and units of any one of the devices mentioned above.


Abstract

Examples of the present disclosure provide a method and device for sharing live desktop information. The method includes: determining a live area on the desktop of a first terminal, the live area being one part of the desktop; capturing image data in the live area during the process of desktop live broadcast; and sending captured image data from the first terminal to a video server, so that the video server processes the image data, and provides video data obtained after the processing for a second terminal. Thus a user of the second terminal is only able to watch information in the live area on the desktop of the first terminal, which protects the user's privacy; in addition, multiple software interfaces in the live area can be shown live during one live broadcast.

Description

METHOD AND DEVICE FOR SHARING LIVE DESKTOP INFORMATION
This application claims the benefit of Chinese Patent Application No. 201310554999.X, filed on November 8, 2013, the disclosure of which is incorporated herein in its entirety by reference.
FIELD
The present disclosure relates to network technologies, and more particularly, to a method and device for sharing live desktop information.
BACKGROUND
With the development of network technologies, by utilizing terminal devices such as a mobile phone and a personal computer, people may not only transmit and share data and make video calls, but also live broadcast desktop activities of a terminal device and share the operations of the terminal device with the opposite party.
At present, the methods for live broadcasting desktop activities include desktop full-screen live broadcast and software process live broadcast.
In the desktop full-screen live broadcast, all activities on the desktop of the terminal device are live broadcasted to the opposite party. The activities include a lot of irrelevant information the opposite party does not need to know, and may even include personal privacy. Thus privacy is lacking. In the software process live broadcast, only one specific software interface may be live broadcasted to the opposite party, and multiple software interfaces are not able to be live broadcasted at the same time. Thus the switch among software cannot be live broadcasted to the opposite party, so that operation interfaces of different software cannot be shown at the same time. Therefore, the flexibility is poor, and the application scope is narrow.
SUMMARY
According to examples of the present disclosure, a method and device for sharing live desktop information is provided to protect the privacy and improve the flexibility.
The method for sharing live desktop information provided by an example of the present disclosure includes: determining a live area on the desktop of a first terminal, the live area being one part of the desktop; capturing image data in the live area during the process of desktop live broadcast; and sending, by the first terminal, captured image data to a video server, so that the video server processes the image data, and provides a second terminal with video data obtained after the processing.
The device for sharing live desktop information provided by an example of the present disclosure includes: a determining module, configured to determine a live area on the desktop of a first terminal, the live area being one part of the desktop; a capturing module, configured to capture image data in the live area during the process of desktop live broadcast; and a sending module, configured to send captured image data to a video server, so that the video server processes the image data, and provides a second terminal with video data obtained after the processing.
In the present disclosure, the first terminal may determine the live area on the desktop of the first terminal, only capture image data in the live area during the process of desktop live broadcast, and send captured image data to the video server, so that the video server may process the image data, and provide video data obtained after the processing for the second terminal. Thus the user of the second terminal is only able to watch information in the live area on the desktop of the first terminal, and is not able to watch information outside the live area on the desktop, thus the user's privacy may be protected. In addition, which area on the desktop of the first terminal the user of the second terminal may watch is determined by the live area, and has nothing to do with which software windows are activated on the desktop. Multiple software interfaces in the live area can be broadcasted live during one live broadcast. Thus the switch among software may be live broadcasted, so that operation interfaces of different software may be shown at the same time. Therefore, the flexibility is improved, and the application scope is broadened.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to make technical solutions of examples of the present disclosure clearer, accompanying drawings to be used in the description of the examples will be briefly introduced hereinafter. Obviously, the accompanying drawings described hereinafter are only some examples of the present disclosure. Those skilled in the art may obtain other drawings according to these accompanying drawings without creative labor.
Figure 1 is a flow diagram illustrating a method for sharing live desktop information according to an example of the present disclosure.
Figure 2 is a flow diagram illustrating a method for sharing live desktop information according to another example of the present disclosure.
Figure 3 is a schematic diagram illustrating a live interface according to an example of the present disclosure.
Figure 4 is a flow diagram illustrating interactive operations among a first terminal, a video server and a second terminal according to an example of the present disclosure.
Figure 5 is a schematic diagram illustrating a structure of a device for sharing live desktop information according to another example of the present disclosure.
Figure 6 is a schematic diagram illustrating another structure of the device for sharing live desktop information according to an example of the present disclosure.
DETAILED DESCRIPTION
Reference will now be made in detail to examples, which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. Also, the figures are illustrations of an example, in which modules or procedures shown in the figures are not necessarily essential for implementing the present disclosure. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the examples. As used herein, the term “includes” means includes but not limited to, the term “including” means including but not limited to. The term “based on” means based at least in part on. In addition, the terms “a” and “an” are intended to denote at least one of a particular element.
Figure 1 is a flow diagram illustrating a method for sharing live desktop information according to an example of the present disclosure. As shown in figure 1, the method may include the following operations.
In block 101, a live area is determined on the desktop of the first terminal. The live area is one part of the desktop.
In block 102, image data in the live area is captured during the process of desktop live broadcast.
In block 103, the first terminal sends captured image data to a video server, so that the video server may process the image data, and provide a second terminal with video data obtained after the processing.
In the example, the first terminal may determine the live area on the desktop of the first terminal, only capture image data in the live area during the process of desktop live broadcast, and send captured image data to the video server, so that the video server may process the image data, and provide video data obtained after the processing for the second terminal. Thus the user of the second terminal is only able to watch information in the live area on the desktop of the first terminal, and is not able to watch information outside the live area on the desktop, which protects the user's privacy. In addition, which scope on the desktop of the first terminal the user of the second terminal may watch is determined by the live area, and has nothing to do with which software windows are activated on the desktop. Multiple software interfaces in the live area may be broadcasted live in one live broadcast. Thus the switch among software may be live broadcasted to the opposite party, so that operation interfaces of different software may be shown at the same time. Therefore, the flexibility is improved, and the application scope is broadened.
In another example, the live area is determined on the desktop of the first terminal may include the followings.
When a drag-and-drop operation is detected, the start position of the drag-and-drop operation on the desktop is taken as a first position, the final position of the drag-and-drop operation on the desktop is taken as a second position, and a rectangular area of which the diagonal vertexes are the first position and the second position is obtained; the rectangular area of which the diagonal vertexes are the first position and the second position is then determined as the live area.
In another example, the method may further include the followings.
During the process of the drag-and-drop operation, according to the movement path of the drag-and-drop operation, a rectangular box of which the diagonal vertexes are the first position and the current position of the drag-and-drop operation on the desktop is displayed.
In another example, after the live area is determined on the desktop of the first terminal, the method may further include the followings.
When a drag-and-drop operation on the live area border is detected, according to the movement path of the drag-and-drop operation on the live area border, the position of the live area is adjusted, and image data capturing is performed according to the adjusted live area.
In another example, when the drag-and-drop operation on the live area border is detected, according to the movement path of the drag-and-drop operation on the live area border, the position of the live area is adjusted, may include the followings.
Every time a first preset time interval elapses, the position of the live area is adjusted to the current position of the live area border on the desktop; or every time the distance by which the live area border is dragged-and-dropped is larger than or equal to a first preset threshold, the position of the live area is adjusted to the current position of the live area border on the desktop.
In another example, after the live area is determined on the desktop of the first terminal, the method may further include the followings.
When a drag-and-drop operation on any one side or any one vertex of the live area border is detected, according to the movement path of the drag-and-drop operation on the side or vertex of the live area border, the size of the live area is adjusted, and image data capturing is performed according to the adjusted live area.
In another example, when a drag-and-drop operation on any one side or any one vertex of the live area border is detected, according to the movement path of the drag-and-drop operation on the side or vertex of the live area border, the size of the live area is adjusted, may include the followings.
Every time a second preset time interval elapses, the size of the live area is adjusted to the current size of the live area border on the desktop; or every time the distance by which any one side or any one vertex of the live area border is dragged-and-dropped is larger than or equal to a second preset threshold, the size of the live area is adjusted to the current size of the live area border on the desktop.
All the above-mentioned examples may be combined in any manner to generate an optional example of the present disclosure, and no further descriptions will be given hereinafter.
Figure 2 is a flow diagram illustrating a method for sharing live desktop information according to another example of the present disclosure. The executive subject for executing the method may be the first terminal. As shown in figure 2, the method may include the following operations.
In block 201, a live area is determined on the desktop of the first terminal. The live area is one part of the desktop.
When an application for live broadcasting desktop information is started, the live area may be determined on the desktop of the first terminal by a user. When desktop information is live broadcasted, the first terminal only live broadcasts the display interface in the live area on the desktop to the second terminal, and does not live broadcast the display interface outside of the live area on the desktop. Thus, through the desktop live broadcast, the second terminal may only watch desktop information in the live area, and may not watch desktop information outside of the live area on the desktop of the first terminal. When determining the live area, the user may exclude private content on the desktop, and other information the user does not want the second terminal to watch, from the live area. Thus the privacy of the user is protected. The second terminal may be a terminal device acting as a watching party during the process of desktop live broadcast performed by the first terminal.
The block 201 may include the followings. When a drag-and-drop operation is detected, the start position of the drag-and-drop operation on the desktop is taken as a first position, the final position of the drag-and-drop operation on the desktop is taken as a second position, and a rectangular area of which the diagonal vertexes are the first position and the second position is obtained; the rectangular area of which the diagonal vertexes are the first position and the second position is then determined as the live area. For a touch screen, the drag-and-drop operation may be triggered by the user using fingers on the touch screen. Of course, the drag-and-drop operation may also be triggered by a control device of the terminal, such as a mouse or a tablet.
Figure 3 is a schematic diagram illustrating a live area on a live interface according to an example of the present disclosure. As shown in figure 3, on the desktop 31, the first position 32 is the start position of the drag-and-drop operation, and the second position 33 is the final position of the drag-and-drop operation. According to the first position 32 and the second position 33, the first terminal may determine a rectangular area of which the diagonal vertexes are the first position and the second position as the live area 34.
In a practical system, the direction of the drag-and-drop operation may be from upper left to lower right, from lower left to upper right, from upper right to lower left, or from lower right to upper left.
For a terminal of which the control device is a mouse, the first terminal may, by monitoring a LButtonDown message of the mouse, record a first position (screen coordinates) PtMouseStart on the desktop corresponding to the mouse when the LButtonDown message occurs. When the current state is that the live area is selected, the first terminal may monitor a MouseMove event of the mouse, and draw the border of the live area in real time according to the position of the mouse. During this process, when receiving a LButtonUp message (the drag-and-drop operation is over) of the mouse, the first terminal may record a second position (screen coordinates) PtMouseEnd on the desktop corresponding to the mouse. According to the first position PtMouseStart and the second position PtMouseEnd, the first terminal may calculate a rectangular area SelRect bounded by the live area border. The rectangular area SelRect is the live area. The first terminal may pass on the live area to a video engine by taking the live area as a parameter. The video engine acquires image data in the live area. The video engine is one part of the first terminal.
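The walkthrough above (normalizing PtMouseStart and PtMouseEnd into the rectangular area SelRect, whatever the drag direction) can be sketched minimally in Python; the function name and tuple layout are illustrative assumptions, not from the disclosure.

```python
def rect_from_drag(pt_start, pt_end):
    """Compute the live-area rectangle (left, top, right, bottom) whose
    diagonal vertexes are the drag start and end points, so that any of
    the four drag directions yields the same normalized rectangle."""
    (x1, y1), (x2, y2) = pt_start, pt_end
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
```

Normalizing with min/max is what lets a single code path handle all four drag directions mentioned above.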
In addition, during the process of the drag-and-drop operation, according to the movement path of the drag-and-drop operation, a rectangular box of which the diagonal vertexes are the first position and the current position of the drag-and-drop operation on the desktop is displayed. That is to say, during the process of the drag-and-drop operation, the size of the rectangular box changes constantly along with the drag-and-drop operation, thus the user may have an intuitive visual perception of the size and scope of the live area, so that the user may determine the final live area.
In block 202, image data in the live area is captured during the process of desktop live broadcast. When a live broadcast start operation is detected, block 203 is performed; when a live broadcast stop operation is detected, block 206 is performed.
When the first terminal determines the live area, according to the position of the live area, a video engine of the first terminal captures image data in the live area. The image data within the live area may include, but is not limited to, all static and dynamic image data shown in the live area, such as the desktop background, application windows, desktop icons, and the mouse pointer location.
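Capturing only the live area amounts to cropping each full-desktop frame to the rectangle. A minimal sketch, treating a frame as a list of pixel rows (a real video engine would copy from a screen buffer instead):

```python
def capture_live_area(frame, live_rect):
    """Crop a full-desktop frame (a list of pixel rows) to the live area.
    live_rect is (left, top, right, bottom) in desktop coordinates."""
    left, top, right, bottom = live_rect
    # Keep only the rows and columns that fall inside the live area.
    return [row[left:right] for row in frame[top:bottom]]
```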
As shown in figure 3, the live interface includes but is not limited to “live broadcast start” and “live broadcast stop” options. After the live area is determined, when detecting a click operation on the “live broadcast start” option, the first terminal determines that the live broadcast start operation is detected, and block 203 is performed. When detecting a click operation on the “live broadcast stop” option, the first terminal determines that the live broadcast stop operation is detected, and block 206 is performed.
In block 203, the first terminal sends captured image data to a video server, so that the video server may process the image data, and provide a second terminal with video data obtained after the processing.
After the live area is determined, and when the live broadcast start operation is detected, the first terminal sends captured image data to a video server, so that the video server may process the image data, and provide a second terminal with video data obtained after the processing.
Figure 4 is a flow diagram illustrating interactive operations among the first terminal, the video server and the second terminal according to an example of the present disclosure. The video server includes a signaling service module and a video service module. As shown in figure 4, the first terminal sends a live broadcast request message to the video server, and after receiving a startup success message returned by the video server, starts the live broadcast. The first terminal sends captured image data in the live area to the video server,  and the video server sends the second terminal a message for informing the second terminal to receive live broadcast content, processes received image data, and delivers processed image data to the second terminal. The second terminal may play the image data, so that a holder of the second terminal may watch the image data. When detecting a live broadcast stop operation, the first terminal stops sending image data, and sends a live broadcast stop message to the video server, and the video server stops delivering image data to the second terminal.
In block 204, when a drag-and-drop operation on the live area border is detected, according to the movement path of the drag-and-drop operation on the live area border, the position of the live area is adjusted, and image data capturing is performed according to the adjusted live area.
During the process of desktop live broadcast, the position of the live area may be adjusted through the drag-and-drop operation on the live area border. The drag-and-drop operation may be direct drag and drop performed on the live area border, or may be drag and drop performed on a drag-and-drop button on the live interface. As shown in figure 3, the live interface further includes a drag-and-drop button 35. During the process of desktop live broadcast, when detecting a drag-and-drop operation on the drag-and-drop button 35, the first terminal may determine that a drag-and-drop operation on the live area border is detected, and according to the movement path of the drag-and-drop operation on the drag-and-drop button 35, namely according to the movement path of the drag-and-drop operation on the live area border, adjusts the position of the live area, displays the position of the live area during the process of dragging, and performs image data capturing according to the adjusted live area.
In a practical system, for a terminal of which the control device is a mouse, after the live area is determined, the first terminal may, by monitoring a LButtonDown message of the mouse, record a first position (screen coordinates) PtMouseStart on the desktop corresponding to the mouse when the drag-and-drop operation on the live area border occurs; and by monitoring a MouseMove event of the mouse, draw the border of the live area in real time according to the position of the mouse. During this process, the first terminal records a current position PtMouseCur on the desktop corresponding to the mouse in real time, and according to the current position PtMouseCur, the first position PtMouseStart, and a rectangular area SelRect bounded by the live area border, calculates a current live area, and simultaneously updates the value of SelRect. Subsequently, a video engine captures image data in the current live area.
Since the live area changes constantly during the drag-and-drop operation on the live area border, in order to avoid the poor performance resulting from the video engine frequently responding to changes of the live area, there may be the following two methods for achieving the block 204. 1) Every time a first preset time interval elapses, the position of the live area is adjusted to the current position of the live area border on the desktop. For example, the live area determined before a drag-and-drop operation occurs is called a first live area; when a drag-and-drop operation on the border of the first live area is detected, a timer is started, and when the length of the timer reaches the first preset time interval, the area bounded by the current border of the live area on the desktop is determined as a second live area. The first terminal starts to capture image data in the second live area through a video engine, and the timer is restarted; when the length of the timer reaches the first preset time interval again, the area bounded by the current border of the live area on the desktop is determined as a third live area, and so on. 2) Every time the distance by which the live area border is dragged-and-dropped is larger than or equal to a first preset threshold, the position of the live area is adjusted to the current position of the live area border on the desktop.
For instance, the live area determined before a drag-and-drop operation occurs is called a first live area. When a drag-and-drop operation on the border of the first live area is detected, and when the distance by which the first live area border is dragged-and-dropped is larger than or equal to the first preset threshold, the area bounded by the current border of the live area on the desktop is determined as a second live area, and the first terminal starts to capture image data in the second live area through a video engine. When the distance by which the second live area border is dragged-and-dropped is larger than or equal to the first preset threshold, the area bounded by the current border of the live area on the desktop is determined as a third live area, and the first terminal starts to capture image data in the third live area through the video engine, and so on.
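The second method above (re-capture only once the border has moved at least a threshold distance) can be sketched as follows; the class and method names are illustrative assumptions, not from the disclosure.

```python
import math

class BorderDragThrottle:
    """Move the live area only when the border has been dragged at least
    `threshold` pixels since the last applied position, so the video
    engine does not respond to every intermediate mouse position."""

    def __init__(self, live_rect, threshold):
        self.live_rect = live_rect            # (left, top, right, bottom)
        self.threshold = threshold
        self._anchor = (live_rect[0], live_rect[1])

    def on_drag(self, new_left, new_top):
        """Return the (possibly unchanged) live area for this drag position."""
        dx = new_left - self._anchor[0]
        dy = new_top - self._anchor[1]
        # Only apply the new position once the drag distance reaches the threshold.
        if math.hypot(dx, dy) >= self.threshold:
            width = self.live_rect[2] - self.live_rect[0]
            height = self.live_rect[3] - self.live_rect[1]
            self.live_rect = (new_left, new_top, new_left + width, new_top + height)
            self._anchor = (new_left, new_top)
        return self.live_rect
```

Intermediate drags shorter than the threshold leave the captured area untouched, which is exactly the performance trade-off the text describes.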
It should be noted that, it is only taken as an example in figure 3 that the live interface includes a drag-and-drop button, and when a drag-and-drop operation on the drag-and-drop button is detected, the first terminal determines that a drag-and-drop operation on the border of the first live area is detected. In other examples of the present disclosure, the live interface may further provide a button for selecting the whole live area border, or another manner for selecting the whole live area border, so as to perform a drag-and-drop operation on the live area border. No further descriptions will be given hereinafter.
In block 205, when a drag-and-drop operation on any one side or any one vertex of the live area border is detected, according to the movement path of the drag-and-drop operation on the side or vertex of the live area border, the size of the live area is adjusted, and image data capturing is performed according to the adjusted live area.
During the process of desktop live broadcast, the size of the live area may be adjusted through a drag-and-drop operation on any one side or any one vertex of the live area border. When a drag-and-drop operation on any one side or any one vertex of the live area border is detected, according to the movement path of the drag-and-drop operation on the side or vertex of the live area border, the size of the live area is adjusted to the current size of the live area border on the desktop. The position of the live area is displayed during the process of dragging, and image data capturing is performed according to the adjusted live area.
Since the live area changes constantly during the drag-and-drop operation on the live area border, in order to avoid the poor performance resulting from the video engine frequently responding to changes of the live area, there may be the following two methods for achieving the block 205. 1) Every time a second preset time interval elapses, the size of the live area is adjusted to the current size of the live area border on the desktop. 2) Every time the distance by which any one side or any one vertex of the live area border is dragged-and-dropped is larger than or equal to a second preset threshold, the size of the live area is adjusted to the current size of the live area border on the desktop.
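The first method above (apply the dragged size at most once per time interval) can be sketched as follows; the clock is injected so the sketch can be exercised without real time elapsing, and all names are illustrative assumptions, not from the disclosure.

```python
class ResizeThrottle:
    """Adjust the live area to the dragged border size at most once per
    `interval` seconds, instead of on every intermediate resize event."""

    def __init__(self, live_rect, interval, clock):
        self.live_rect = live_rect
        self.interval = interval
        self._clock = clock                   # callable returning seconds
        self._last_applied = clock()

    def on_resize(self, dragged_rect):
        """Return the (possibly unchanged) live area for this drag state."""
        now = self._clock()
        # Only apply the dragged size once the interval has elapsed.
        if now - self._last_applied >= self.interval:
            self.live_rect = dragged_rect
            self._last_applied = now
        return self.live_rect
```

In a real system `clock` could be a monotonic timer; resize events arriving inside the interval are simply ignored, matching the timer-based description above.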
In a practical system, the process for adjusting the size of the live area in block 205 may be similar to the process for adjusting the position of the live area in block 204, and no further descriptions will be given hereinafter.
It should be noted that, it is only taken as an example that the drag-and-drop operation on any one side or any one vertex of the live area border is performed by a mouse. In other examples of the present disclosure, a “live area enlarge” button and a “live area reduce” button may be further provided; when the “live area enlarge” button is clicked, the live area is enlarged according to a preset magnification factor, and when the “live area reduce” button is clicked, the live area is reduced according to a preset minification factor. During the process of enlarging or reducing, the center of the live area remains unchanged. Or, in other examples of the present disclosure, the adjustment of the size of the live area may also be triggered in other manners, and examples of the present disclosure do not limit how the adjustment of the live area is triggered.
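Enlarging or reducing about an unchanged center, as described for the “live area enlarge” and “live area reduce” buttons above, can be sketched as follows (the function name and rectangle layout are illustrative assumptions):

```python
def scale_about_center(rect, factor):
    """Scale a live area (left, top, right, bottom) by `factor`
    while keeping its center point fixed."""
    left, top, right, bottom = rect
    # Center of the rectangle stays fixed; only the half-extents scale.
    cx, cy = (left + right) / 2.0, (top + bottom) / 2.0
    half_w = (right - left) * factor / 2.0
    half_h = (bottom - top) * factor / 2.0
    return (cx - half_w, cy - half_h, cx + half_w, cy + half_h)
```

A magnification factor greater than 1 enlarges the area and a minification factor between 0 and 1 reduces it, with the center unchanged in both cases.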
In block 206, when a live broadcast stop operation is detected, the live broadcast of image data in the live area is stopped, namely, capturing of image data in the live area is stopped, and the live interface is shut off.
During the process of determining the live area, or during the process of live broadcast, when a live broadcast stop operation is detected, for instance, when the “live broadcast stop” option on the live interface (as shown in figure 3) is selected, the live broadcast of image data in the live area is stopped, and the live interface is shut off.
In the method for sharing live desktop information provided by examples of the present disclosure, the first terminal may determine the live area on the desktop of the first terminal, only capture image data in the live area during the process of desktop live broadcast, and send captured image data to the video server, so that the video server may process the image data, and provide video data obtained after the processing for the second terminal. Thus the user of the second terminal is only able to watch information in the live area on the desktop of the first terminal, and is not able to watch information outside the live area on the desktop, which protects the user's privacy. In addition, which area on the desktop of the first terminal the user of the second terminal may watch is determined by the live area, and has nothing to do with which software windows are activated on the desktop. Multiple software interfaces in the live area may be broadcasted live in one live broadcast. Thus the switch among software may be live broadcasted to the opposite party, so that operation interfaces of different software may be shown at the same time. Therefore, the flexibility is improved, and the application scope is broadened.
Figure 5 is a schematic diagram illustrating a structure of a device for sharing live desktop information according to another example of the present disclosure. As shown in figure 5, the device may include a determining module 501, a capturing module 502 and a sending module 503.
The determining module 501 is configured to determine a live area on the desktop of a first terminal. The live area is one part of the desktop.
The capturing module 502 is configured to capture image data in the live area during the process of desktop live broadcast.
The sending module 503 is configured to send captured image data to a video server, so that the video server may process the image data, and provide a second terminal with video data obtained after the processing.
In another example, the determining module 501 may include an obtaining unit and a determining unit. The obtaining unit is configured to, when a drag-and-drop operation is detected, take the start position of the drag-and-drop operation on the desktop as a first position, take the final position of the drag-and-drop operation on the desktop as a second position, and obtain a rectangular area of which the diagonal vertexes are the first position and the second position. The determining unit is configured to determine the rectangular area of which the diagonal vertexes are the first position and the second position as the live area.
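The obtaining unit's computation can be sketched as follows. The function name and coordinate convention are assumptions; the normalization step reflects that the user may drag in any direction while the first and second positions must still form valid diagonal vertexes.

```python
def live_area_from_drag(first_pos, second_pos):
    """Build the live area rectangle whose diagonal vertexes are the
    drag start position (first position) and end position (second position)."""
    (x1, y1), (x2, y2) = first_pos, second_pos
    # Normalize the corners so the rectangle is well-formed no matter
    # which direction the drag-and-drop operation moved.
    left, top = min(x1, x2), min(y1, y2)
    width, height = abs(x2 - x1), abs(y2 - y1)
    return (left, top, width, height)
```

For example, dragging from (100, 50) to (40, 200) and dragging from (40, 50) to (100, 200) both yield the same live area.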
In another example, the device may further include a displaying module, configured to, during the process of the drag-and-drop operation, according to the movement path of the drag-and-drop operation, display a rectangular box of which the diagonal vertexes are the first position and the current position of the drag-and-drop operation on the desktop.
In another example, the device may further include a first adjusting module, configured to, when a drag-and-drop operation on the live area border is detected, according to the movement path of the drag-and-drop operation on the live area border, adjust the position of the live area, and perform image data capturing according to the adjusted live area.
In another example, the first adjusting module may include a first adjusting unit or a second adjusting unit. The first adjusting unit is configured to, at every first preset time interval, adjust the position of the live area to the current position of the live area border on the desktop. The second adjusting unit is configured to, each time the distance by which the live area border is dragged-and-dropped is larger than or equal to a first preset threshold, adjust the position of the live area to the current position of the live area border on the desktop.
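The two adjusting strategies, per-time-interval and per-distance-threshold, can be sketched as below. The class name, the callback shape, the pixel units, and the injectable clock are illustrative assumptions; the disclosure only specifies the two trigger conditions.

```python
import math
import time

class LiveAreaPositionAdjuster:
    """Commit the dragged border position to the live area either at a
    fixed time interval or when the drag distance reaches a threshold."""

    def __init__(self, position, interval=None, threshold=None,
                 clock=time.monotonic):
        self.position = position    # committed (x, y) of the live area
        self.interval = interval    # first preset time interval (seconds)
        self.threshold = threshold  # first preset distance threshold (pixels)
        self.clock = clock
        self._last_commit = clock()

    def on_border_drag(self, border_pos):
        """Return True if the live area position was adjusted."""
        if self.threshold is not None:
            # Second adjusting unit: commit once the border has moved
            # at least the first preset threshold from the committed spot.
            dx = border_pos[0] - self.position[0]
            dy = border_pos[1] - self.position[1]
            if math.hypot(dx, dy) >= self.threshold:
                self.position = border_pos
                return True
            return False
        if self.interval is not None:
            # First adjusting unit: commit at every preset time interval.
            now = self.clock()
            if now - self._last_commit >= self.interval:
                self.position = border_pos
                self._last_commit = now
                return True
        return False
```

Capturing against the committed position, rather than the live cursor, avoids re-cropping every frame while the user is mid-drag.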
In another example, the device may further include a second adjusting module, configured to, when a drag-and-drop operation on any one side or any one vertex of the live area border is detected, according to the movement path of the drag-and-drop operation on the side or vertex of the live area border, adjust the size of the live area, and perform image data capturing according to the adjusted live area.
In another example, the second adjusting module may include a third adjusting unit or a fourth adjusting unit. The third adjusting unit is configured to, at every second preset time interval, adjust the size of the live area to the current size of the live area border on the desktop. The fourth adjusting unit is configured to, each time the distance by which any one side or any one vertex of the live area border is dragged-and-dropped is larger than or equal to a second preset threshold, adjust the size of the live area to the current size of the live area border on the desktop.
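Resizing by dragging one vertex can be sketched as follows: the diagonally opposite vertex stays fixed, and the rectangle is rebuilt from the two diagonal points. The corner-indexing convention and function name are assumptions for illustration.

```python
def resize_by_vertex(live_area, vertex_index, new_pos):
    """Resize the live area by moving one corner vertex to new_pos,
    keeping the diagonally opposite vertex fixed.

    live_area: (left, top, width, height).
    vertex_index: 0=top-left, 1=top-right, 2=bottom-right, 3=bottom-left
    (illustrative convention).
    """
    left, top, width, height = live_area
    corners = [(left, top), (left + width, top),
               (left + width, top + height), (left, top + height)]
    # The vertex opposite the dragged one anchors the resized rectangle.
    ox, oy = corners[(vertex_index + 2) % 4]
    nx, ny = new_pos
    return (min(ox, nx), min(oy, ny), abs(nx - ox), abs(ny - oy))
```

Dragging a side rather than a vertex works the same way, except only one coordinate of the moving edge changes.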
All of the above examples may be combined in any manner to generate optional examples of the present disclosure, and no further description will be given hereinafter.
In the above examples, the modules and units included in the device are divided according to functional logic, but are not limited to this division, as long as the corresponding functions can be achieved. The modules and units may be distributed in the device as described in the examples, or may be relocated into one or more devices different from those in the examples. The modules and units in the above examples may be merged into one module, or may be further divided into multiple sub-modules. In addition, the names of the units are only used to distinguish them from each other, and are not intended to limit the scope of the present disclosure.
The above examples of the device for sharing live desktop information and the examples of the method for sharing live desktop information belong to the same idea; for the specific implementation, reference may be made to the method examples, and no further description will be provided hereinafter.
The above examples may be implemented by hardware, software, firmware, or a combination thereof. For example, the various methods, processes and functional modules described herein may be implemented by a processor (the term processor is to be interpreted broadly to include a CPU, processing unit/module, ASIC, logic module, or programmable gate array, etc.). The processes, methods and functional modules may all be performed by a single processor or split between several processors; reference in this disclosure or the claims to a ‘processor’ should thus be interpreted to mean ‘one or more processors’. The processes, methods and functional modules are implemented as machine readable instructions executable by one or more processors, hardware logic circuitry of the one or more processors, or a combination thereof. The modules, if mentioned in the aforesaid examples, may be combined into one module or further divided into a plurality of sub-modules. Further, the examples disclosed herein may be implemented in the form of a software product. The computer software product is stored in a non-transitory storage medium and comprises a plurality of instructions for making an electronic device implement the method recited in the examples of the present disclosure. The non-transitory storage medium includes a hard disk, a floppy disk, a magnetic disk, a compact disk (e.g., CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-RAM, DVD-RW and DVD+RW), a tape, a Flash card, ROM, and so on. Optionally, the program codes may be downloaded from a server computer via a communication network.
For example, figure 6 is a schematic diagram illustrating another structure of the device for sharing live desktop information according to an example of the present disclosure. As shown in figure 6, the device may include a memory 61 and a processor 62 in communication with the memory 61.
The memory 61 may store a group of instructions which may be executed by the processor 62 to implement the operations of modules and units of any one of the devices mentioned above.
The foregoing description, for purpose of explanation, has been described with reference to specific examples. However, the illustrative discussions above are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The examples were chosen and described in order to best explain the principles of the present disclosure and its practical applications, to thereby enable others skilled in the art to best utilize the present disclosure and various examples with various modifications as are suited to the particular use contemplated.

Claims (15)

  1. A method for sharing live desktop information, comprising:
    determining a live area on the desktop of a first terminal, wherein the live area is one part of the desktop;
    capturing image data in the live area during the process of desktop live broadcast; and
    sending, by the first terminal, captured image data to a video server, so that the video server processes the image data, and provides a second terminal with video data obtained after the processing.
  2. The method according to claim 1, wherein determining a live area on the desktop of a first terminal comprises:
    when detecting a drag-and-drop operation, taking, by the first terminal, the start position of the drag-and-drop operation on the desktop as a first position, taking, by the first terminal, the final position of the drag-and-drop operation on the desktop as a second position, and obtaining, by the first terminal, a rectangular area of which the diagonal vertexes are the first position and the second position; and
    determining, by the first terminal, the rectangular area of which the diagonal vertexes are the first position and the second position as the live area.
  3. The method according to claim 2, further comprising:
    during the process of the drag-and-drop operation, according to the movement path of the drag-and-drop operation, displaying a rectangular box of which the diagonal vertexes are the first position and the current position of the drag-and-drop operation on the desktop.
  4. The method according to claim 1, wherein, after determining the live area on the desktop of the first terminal, further comprising:
    when detecting a drag-and-drop operation on the live area border, according to the movement path of the drag-and-drop operation on the live area border, adjusting the position of the live area, and performing image data capturing according to the adjusted live area.
  5. The method according to claim 4, wherein, according to the movement path of the drag-and-drop operation on the live area border, adjusting the position of the live area, comprises:
    at every first preset time interval, adjusting the position of the live area to the current position of the live area border on the desktop; or
    each time the distance by which the live area border is dragged-and-dropped is larger than or equal to a first preset threshold, adjusting the position of the live area to the current position of the live area border on the desktop.
  6. The method according to claim 1, wherein, after determining the live area on the desktop of the first terminal, further comprising:
    when detecting a drag-and-drop operation on any one side or any one vertex of the live area border, according to the movement path of the drag-and-drop operation on the side or vertex of the live area border, adjusting the size of the live area, and performing image data capturing according to the adjusted live area.
  7. The method according to claim 6, wherein, according to the movement path of the drag-and-drop operation on the side or vertex of the live area border, adjusting the size of the live area, comprises:
    at every second preset time interval, adjusting the size of the live area to the current size of the live area border on the desktop; or
    each time the distance by which any one side or any one vertex of the live area border is dragged-and-dropped is larger than or equal to a second preset threshold, adjusting the size of the live area to the current size of the live area border on the desktop.
  8. A device for sharing live desktop information, comprising:
    a determining module, configured to determine a live area on the desktop of a first terminal, wherein the live area is one part of the desktop;
    a capturing module, configured to capture image data in the live area during the process of desktop live broadcast; and
    a sending module, configured to send captured image data to a video server, so that the video server processes the image data, and provides a second terminal with video data obtained after the processing.
  9. The device according to claim 8, wherein the determining module comprises:
    an obtaining unit, configured to, when a drag-and-drop operation is detected, take the start position of the drag-and-drop operation on the desktop as a first position, take the final position of the drag-and-drop operation on the desktop as a second position, and obtain a rectangular area of which the diagonal vertexes are the first position and the second position; and
    a determining unit, configured to determine the rectangular area of which the diagonal vertexes are the first position and the second position as the live area.
  10. The device according to claim 9, further comprising:
    a displaying module, configured to, during the process of the drag-and-drop operation, according to the movement path of the drag-and-drop operation, display a rectangular box of which the diagonal vertexes are the first position and the current position of the drag-and-drop operation on the desktop.
  11. The device according to claim 8, further comprising:
    a first adjusting module, configured to, when a drag-and-drop operation on the live area border is detected, according to the movement path of the drag-and-drop operation on the live area border, adjust the position of the live area, and perform image data capturing according to the adjusted live area.
  12. The device according to claim 11, wherein the first adjusting module comprises:
    a first adjusting unit, configured to, at every first preset time interval, adjust the position of the live area to the current position of the live area border on the desktop; and
    a second adjusting unit, configured to, each time the distance by which the live area border is dragged-and-dropped is larger than or equal to a first preset threshold, adjust the position of the live area to the current position of the live area border on the desktop.
  13. The device according to claim 8, further comprising:
    a second adjusting module, configured to, when a drag-and-drop operation on any one side or any one vertex of the live area border is detected, according to the movement path of the drag-and-drop operation on the side or vertex of the live area border, adjust the size of the live area, and perform image data capturing according to the adjusted live area.
  14. The device according to claim 13, wherein the second adjusting module comprises:
    a third adjusting unit, configured to, at every second preset time interval, adjust the size of the live area to the current size of the live area border on the desktop; and
    a fourth adjusting unit, configured to, each time the distance by which any one side or any one vertex of the live area border is dragged-and-dropped is larger than or equal to a second preset threshold, adjust the size of the live area to the current size of the live area border on the desktop.
  15. A device for sharing live desktop information, comprising: a memory and a processor in communication with the memory;
    the memory stores a group of instructions which may be executed by the processor to implement the operations of modules and/or units of the device according to any one of claims 8 to 14.
PCT/CN2014/090555 2013-11-08 2014-11-07 Method and device for sharing live desktop information WO2015067207A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310554999.XA CN103595715B (en) 2013-11-08 2013-11-08 Information sharing method and device for desktop live broadcasting
CN201310554999.X 2013-11-08

Publications (1)

Publication Number Publication Date
WO2015067207A1 true WO2015067207A1 (en) 2015-05-14

Family

ID=50085697

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/090555 WO2015067207A1 (en) 2013-11-08 2014-11-07 Method and device for sharing live desktop information

Country Status (2)

Country Link
CN (1) CN103595715B (en)
WO (1) WO2015067207A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4008417A4 (en) * 2020-01-20 2022-12-21 Tencent Technology (Shenzhen) Company Limited Image switching method and apparatus, and device and medium

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
CN103595715B (en) * 2013-11-08 2017-02-15 腾讯科技(成都)有限公司 Information sharing method and device for desktop live broadcasting
CN104394437B (en) * 2014-12-09 2018-01-12 广州华多网络科技有限公司 A kind of online live method and system that start broadcasting
CN105338261B (en) * 2015-11-02 2018-09-25 天脉聚源(北京)教育科技有限公司 A kind of method and device of transmission picture relevant information
CN105791885A (en) * 2016-03-31 2016-07-20 成都西可科技有限公司 Method of initiating video live broadcast with one click on motion camera
CN108173944A (en) * 2017-12-29 2018-06-15 北京奇艺世纪科技有限公司 A kind of virtual window sharing method and system
CN113709577B (en) * 2020-05-21 2023-05-23 腾讯科技(深圳)有限公司 Video session method
CN112256169B (en) * 2020-10-14 2021-08-10 北京达佳互联信息技术有限公司 Content display method and device, electronic equipment and storage medium
CN112911196A (en) * 2021-01-15 2021-06-04 随锐科技集团股份有限公司 Multi-lens collected video image processing method and system

Citations (6)

Publication number Priority date Publication date Assignee Title
CN101883140A (en) * 2010-06-13 2010-11-10 北京北大众志微系统科技有限责任公司 Coding system and method based on remote display as well as server
CN101888519A (en) * 2009-05-14 2010-11-17 华为技术有限公司 Method for sharing desktop contents and intelligent equipment
CN101984402A (en) * 2010-11-16 2011-03-09 广东威创视讯科技股份有限公司 Image acquisition and compression method and related device
CN102143156A (en) * 2010-12-31 2011-08-03 华为技术有限公司 Desktop sharing method and device
CN102724138A (en) * 2012-06-28 2012-10-10 奇智软件(北京)有限公司 Information sharing method and device for instant messaging
CN103595715A (en) * 2013-11-08 2014-02-19 腾讯科技(成都)有限公司 Information sharing method and device for desktop live broadcasting

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
JP4176122B2 (en) * 2006-10-24 2008-11-05 株式会社東芝 Server terminal, screen sharing method and program
US8407605B2 (en) * 2009-04-03 2013-03-26 Social Communications Company Application sharing
JP5137641B2 (en) * 2008-03-19 2013-02-06 キヤノン株式会社 Information processing apparatus, image processing system, image processing method, and program
CN101370115A (en) * 2008-10-20 2009-02-18 深圳华为通信技术有限公司 Conference terminal, conference server, conference system and data processing method
CN101494547A (en) * 2009-03-05 2009-07-29 广东威创视讯科技股份有限公司 Method and system for implementing conference combining local conference and network conference equipment
CN102883135B (en) * 2012-11-01 2015-08-26 成都飞视美视频技术有限公司 Screen sharing and control method
CN102883134B (en) * 2012-11-01 2015-04-22 成都飞视美视频技术有限公司 Screen sharing and controlling method for video conference system
CN103345506A (en) * 2013-07-03 2013-10-09 云南电网公司 Rapid enterprise-level knowledge-gathering tool based on desktop terminal



Also Published As

Publication number Publication date
CN103595715A (en) 2014-02-19
CN103595715B (en) 2017-02-15

Similar Documents

Publication Publication Date Title
WO2015067207A1 (en) Method and device for sharing live desktop information
US10235574B2 (en) Image-capturing device, recording device, and video output control device
EP3182716A1 (en) Method and device for video display
CN111866423B (en) Screen recording method for electronic terminal and corresponding equipment
US9715751B2 (en) Zooming to faces depicted in images
WO2017092360A1 (en) Interaction method and device used when multimedia is playing
US10535324B2 (en) Display device and display method thereof
CN106940621B (en) Picture processing method and device
US20130016128A1 (en) Tiled Zoom of Multiple Digital Image Portions
CN103942001A (en) Free screen capture method of mobile terminal and mobile terminal
WO2017045591A1 (en) Method and device for displaying push information
CN106156240B (en) Information processing method, information processing device and user equipment
CN113010136B (en) Method and system for intelligently amplifying shared desktop and readable storage medium
CN106843700B (en) Screenshot method and device
US20150154840A1 (en) System and method for managing video analytics results
JP2016177614A (en) Conference system, information processing device, information terminal, and program
WO2017054142A1 (en) Video data acquisition method, apparatus and system, and computer readable storage medium
US9626580B2 (en) Defining region for motion detection
CN106648281B (en) Screenshot method and device
CN111050204A (en) Video clipping method and device, electronic equipment and storage medium
US20190155462A1 (en) Method for Viewing Application Program, Graphical User Interface, and Terminal
WO2017113713A1 (en) Method and device for adjusting display interface
CN110889057A (en) Business data visualization method and business object visualization device
CN111064930B (en) Split screen display method, display terminal and storage device
CN110569097B (en) Information display method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14860459

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 10/10/2016)

122 Ep: pct application non-entry in european phase

Ref document number: 14860459

Country of ref document: EP

Kind code of ref document: A1