CN110457522B - Information sharing method and device, terminal and storage medium - Google Patents

Info

Publication number
CN110457522B
CN110457522B (application CN201910755806.4A)
Authority
CN
China
Prior art keywords
timestamp
video
window
information
receiving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910755806.4A
Other languages
Chinese (zh)
Other versions
CN110457522A (en)
Inventor
方迟
朱海舟
王笑
齐帅东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN201910755806.4A
Publication of CN110457522A
Application granted
Publication of CN110457522B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/70 Information retrieval of video data
    • G06F 16/71 Indexing; Data structures therefor; Storage structures
    • G06F 16/74 Browsing; Visualisation therefor
    • G06F 16/745 Browsing the internal structure of a single video sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present disclosure provides a method and an apparatus for sharing information without interrupting playback, together with a terminal and a storage medium. The method comprises the following steps: receiving instruction information and recording a timestamp of its receipt; extracting, according to the timestamp, the video frame information corresponding to that timestamp from the video file as it continues to play; and sharing the extracted information to a designated location. Because multiple loaded windows remain simultaneously operable, the user can select and send the content to be shared without interrupting the video, and the viewing experience is unaffected.

Description

Information sharing method and device, terminal and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and an apparatus for information sharing, a terminal, and a storage medium.
Background
Taking a screenshot of a video currently requires pausing playback, yet the user often does not want to stop the video. Such an operation therefore imposes additional time and interaction costs on the user.
Disclosure of Invention
In order to solve the existing problems, the present disclosure provides an information sharing method and apparatus, a terminal, and a storage medium.
The present disclosure adopts the following technical solutions.
In some embodiments, the present disclosure provides a method of information sharing, comprising:
receiving instruction information, and recording a timestamp of the receipt of the instruction information;
extracting, according to the timestamp, video frame information corresponding to the timestamp from the video file as it continues to play; and
sharing the information to a specified location.
In some embodiments, the present disclosure provides an information sharing apparatus, comprising:
a receiving module configured to receive instruction information;
a time module configured to obtain a timestamp of the receipt of the instruction information;
an extraction module configured to extract information, according to the timestamp, from the video file as it continues to play; and
a sending module configured to send the information to a specified location.
In some embodiments, the present disclosure provides a terminal comprising: at least one memory and at least one processor;
the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the method.
In some embodiments, the present disclosure provides a storage medium for storing program code for performing the above-described method.
The disclosed method, apparatus, terminal, and storage medium support screenshot sharing while video playback continues, so viewing continuity is unaffected.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and components are not necessarily drawn to scale.
Fig. 1 is a flowchart of an information sharing method without interrupting playback according to an embodiment of the present disclosure.
FIG. 2 is a flow chart of a multi-window parallel method of an embodiment of the present disclosure.
FIG. 3 is a flowchart of a method of loading an activation window of an embodiment of the present disclosure.
FIG. 4 is a flowchart of a method of loading an activation window according to yet another embodiment of the present disclosure.
Fig. 5 is a schematic diagram of a focus label transfer process of an embodiment of the present disclosure.
FIG. 6 is a schematic diagram of a multi-window arrangement of an embodiment of the present disclosure.
FIG. 7 is a schematic diagram of a multi-window arrangement according to another embodiment of the present disclosure.
Fig. 8 is a flowchart of a multi-window parallel method according to yet another embodiment of the present disclosure.
Fig. 9 is a schematic structural diagram of an information sharing apparatus that does not interrupt playing according to an embodiment of the present disclosure.
Fig. 10 is a schematic structural diagram of an information sharing apparatus without interrupting playback according to another embodiment of the present disclosure.
Fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in different orders and/or in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein is intended to be open-ended, i.e., "including but not limited to". The term "based on" is "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a" or "an" in this disclosure are intended to be illustrative rather than limiting, and those skilled in the art will recognize that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of an information sharing method without interruption of playing according to an embodiment of the present disclosure. The method includes the following steps.
S11, receiving the instruction information, and recording a timestamp of the receipt of the instruction information.
Specifically, the user first issues instruction information indicating that a screenshot or a video clip is required. The instruction information thus takes one of two forms: a screenshot instruction or a video-clipping instruction. For a screenshot, the user may issue a single instruction or several discontinuous instructions, and one or more video frames are intercepted or extracted in response. For clipping a segment of the video file, two instructions are needed, corresponding respectively to the start point and the end point of the segment to be clipped.
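The instruction handling in S11 can be sketched as follows. This is an illustrative Python sketch only: the disclosure specifies no implementation language, and the `CaptureSession` class, its method names, and the monotonic-clock default are assumptions.

```python
import time

class CaptureSession:
    """Records timestamps for screenshot and clip-capture instructions
    without touching the playback pipeline (hypothetical sketch)."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self.screenshots = []    # one timestamp per screenshot instruction
        self.clip_bounds = []    # (start, end) timestamp pairs for clips
        self._pending_start = None

    def on_screenshot_instruction(self):
        # A screenshot needs only one instruction; several discontinuous
        # instructions simply record several independent timestamps.
        self.screenshots.append(self._clock())

    def on_clip_instruction(self):
        # A clip needs two instructions: the first marks the start point,
        # the second marks the end point of the segment to extract.
        now = self._clock()
        if self._pending_start is None:
            self._pending_start = now
        else:
            self.clip_bounds.append((self._pending_start, now))
            self._pending_start = None
```

Note that only timestamps are recorded here; frame extraction happens later (S12), which is what lets playback continue undisturbed.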
S12, extracting information, according to the timestamp, from the video file whose playing is not interrupted.
Specifically, each timestamp corresponds to exactly one video frame, so the frame can be located precisely from the timestamp, and the frame the user designated for capture is then intercepted or extracted. For a screenshot instruction, the individual video frame is returned; for a video-clipping instruction, the selected run of consecutive frames is returned according to the start and end timestamps.
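The timestamp-to-frame mapping in S12 can be sketched in Python. This is illustrative only: the disclosure does not specify an implementation, and the frame-rate-based index arithmetic as well as the function and parameter names here are assumptions (a real player would resolve frames through its decoder).

```python
def frames_for_request(timestamps, fps, kind):
    """Map recorded timestamps to frame indices in the playing file.

    One timestamp corresponds to exactly one frame, so a screenshot
    instruction resolves to a single index, while a clip instruction
    resolves to the contiguous run between start and end timestamps.
    """
    if kind == "screenshot":
        (t,) = timestamps            # exactly one timestamp expected
        return [int(t * fps)]
    elif kind == "clip":
        start, end = timestamps      # start point and end point
        return list(range(int(start * fps), int(end * fps) + 1))
    raise ValueError(f"unknown instruction kind: {kind}")
```

For example, at 30 frames per second a screenshot timestamp of 2.0 s resolves to frame index 60, and a clip from 1.0 s to 2.0 s at 10 fps resolves to the eleven frames 10 through 20.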
In practice, a user may not issue the instruction at the precise moment intended: the screenshot instruction may arrive only after the video has continued playing for some time, i.e., with a delay, or the user may change the desired range after watching further. The located video frame therefore needs to be shown to the user, with subsequent operations performed only after the user confirms it; alternatively, the user issues relocation instruction information based on the result. Since the video is not interrupted, however, its time axis cannot be wound back. For this case, a value range can be set around the timestamp, i.e., a run of video frames shortly before and after the corresponding timestamp is extracted together and offered to the user for selection.
The replacement timestamp is recorded according to the user's selection, and the difference between the replacement timestamp and the initial timestamp is memorized for use as the preferred replacement the next time a relocation request is received. If relocation is needed during video clipping, one or both of the two timestamps may be involved: value ranges are set for the start timestamp and the end timestamp separately, and the user may replace either one individually or both at once. For example, a value range is set for the start timestamp; on receiving a relocation request, the video frames within that range are offered to the user, relocation is performed according to the user's selection to obtain a new start timestamp, and the result is generated from it. Relocating the end timestamp proceeds analogously and is not described further here.
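The value-range and preferred-offset mechanism just described can be sketched as follows. A minimal Python illustration under stated assumptions: the `Relocator` name, the symmetric window, and the fixed sampling step are all inventions for this sketch, not part of the disclosure.

```python
class Relocator:
    """Offers candidate frames around a timestamp and remembers the
    user's chosen offset as the preferred default next time (sketch)."""

    def __init__(self, window=2.0):
        self.window = window          # seconds before/after the timestamp
        self.preferred_offset = 0.0   # learned from the last relocation

    def candidates(self, ts, step=0.5):
        # Timestamps in [ts - window, ts + window], ordered so the last
        # preferred offset appears first as the likely choice.
        lo, hi = ts - self.window, ts + self.window
        n = int((hi - lo) / step) + 1
        cands = [round(lo + i * step, 6) for i in range(n)]
        return sorted(cands, key=lambda t: abs(t - (ts + self.preferred_offset)))

    def choose(self, original_ts, chosen_ts):
        # Record the replacement and memorize the offset for next time.
        self.preferred_offset = chosen_ts - original_ts
        return chosen_ts
```

After the user once picks a frame 0.5 s later than the recorded timestamp, the candidate list for the next relocation request leads with the frame at that same offset.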
In addition, whether the clipped result is returned to the user or alternatives are offered, the display window is separate from the video playing window, and the two windows can be operated simultaneously.
The foregoing embodiment relies on a scheme for displaying and operating multiple windows in parallel. The embodiments of the present disclosure do not limit this scheme; for example, the following one may be adopted:
as shown in fig. 2, the multi-window parallel method provided by the embodiment of the present disclosure includes:
s100, loading a first window, and setting the first window to be in an activated state.
Specifically, the loading may involve a start path of the program, an identifier of the program's display window, and a display area for that window. The start path is, for example, the program's installation path or desktop shortcut on the running terminal. The identifier of the display window may be its name or another identifier. The display area of the display window may belong to a screen mapping area.
It should be noted that when a terminal running the Android operating system creates a window, it does so by creating an Activity component, and by default passes full-screen display parameters into the created component, so the created window is displayed full screen. In the embodiment of the present disclosure, however, preset display parameters of the first window are passed into the created Activity component when the first window is created, so that the first window can be displayed on the screen in windowed form.
S200, loading N second windows, wherein N is an integer not less than 1, and setting the N second windows to be in an activated state.
Specifically, the loading step may be the same as in S100. After the first window is loaded and activated, the second windows are loaded in sequence; because their display parameters differ, the windows occupy different parts of the screen mapping area and can be operated simultaneously within it. Normally, when the terminal detects a new window-creation instruction, it transfers the focus label to the newly created window, i.e., performs a focus transfer. The focus label relates to the notion of master and slave tasks on the terminal: the window designated as the focus is the master task and can perform various operations, such as dragging and playing, while a slave task is a window that has been created but is not operating; because it is not designated as the focus window, a slave-task window is typically paused or frozen. To avoid this, the embodiment of the present disclosure proposes that when the application focus is switched, the window losing the focus label is not notified, so its application state does not change and it continues to run. For example, suppose the current focus window is a video window and the user newly creates a chat page window, to which the focus label is transferred. Although the chat page window obtains the focus label, the video window, having received no notification of the focus transfer, keeps playing rather than pausing or freezing for lack of focus.
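The focus-transfer behaviour described above, in which the window losing the focus label is simply not notified, can be sketched as follows. This is a minimal Python illustration; the `Window` and `FocusManager` names and the pause-on-focus-loss default are assumptions, since the disclosure targets Android's Activity mechanism rather than any particular API.

```python
class Window:
    def __init__(self, name):
        self.name = name
        self.state = "running"

    def on_focus_lost(self):
        # Conventional behaviour: a window told it lost focus pauses itself.
        self.state = "paused"

class FocusManager:
    """Transfers the focus label to a newly created window WITHOUT
    notifying the previous holder, so the previous window keeps running."""

    def __init__(self):
        self.focus = None

    def transfer(self, new_window, notify_old=False):
        # notify_old=True models the conventional focus transfer for contrast;
        # the disclosed scheme leaves it False so the old window never pauses.
        if self.focus is not None and notify_old:
            self.focus.on_focus_lost()
        self.focus = new_window
```

With `notify_old=False`, a video window keeps its `"running"` state even after a chat window takes the focus label; with `notify_old=True`, it would pause, which is exactly the behaviour the scheme avoids.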
The above scheme can be summarized as: (1) acquire the program's start path and start a window; (2) detect the started program's windows, generally all valid windows of the program; (3) set all detected windows as active windows.
Fig. 3 is a flowchart of a method for loading an activated window according to an embodiment of the present disclosure. In step S100, loading the activated first window may include the following steps.
S101, obtaining a first window triggering request, responding to the first window triggering request, and loading the first window.
S102, simultaneously sending a focus label to the first window.
More specifically, when no application task exists on the terminal, the user selects an application to start; the terminal opens the application, creates a window, displays it to the user, and at the same time sends the focus label to the newly created window. It will be appreciated that when no focus window is detected, the terminal may not acquire data information.
Fig. 4 is a flowchart of a method for loading an activated window according to another embodiment of the present disclosure. In step S200, loading and activating the second windows may further include the following steps.
S201, obtaining a second window triggering request, responding to the second window triggering request, and loading the second window.
S202, receiving a focus transfer request, and transferring the focus label from the first window to the second window based on the request, without notifying the first window that the focus label has been transferred, so that the first window remains activated.
More specifically, at least one second window may be created while the first window is running; in response to multiple window-trigger requests, the corresponding new windows are loaded respectively. For example, as shown in fig. 5, window 2 is created while window 1 is running, and the focus label is transferred from window 1 to window 2. If window 1 were notified of the loss of the focus label at this point, it might enter a paused or frozen state; but if the label is transferred without notifying window 1, window 1 continues to run as though the label had not been transferred, i.e., its continued running is unaffected. Window 2, having received the focus label, operates as a normal focus window. Both windows thus behave as if holding the focus label and run at the same time. As more windows are created up to window N, the focus label is transferred in creation order, and every window from which the label was taken, having received no focus-transfer notification, still runs in focus mode. Hence windows 1 through N in the same screen mapping area are operable simultaneously, improving window utilization.
In this embodiment, on the one hand, a frame of image or a segment of video is captured without interrupting playback: when the video frame is intercepted/extracted, the video window receives no focus-transfer notification and still operates as if focused. On the other hand, the captured frame/video is displayed in a new window, which is in a running state because the focus label is assigned to it. The user therefore captures images/video without any interruption of video playback.
In addition, the windows arranged in the screen mapping area may partially overlap one another or not overlap at all, and their arrangement may be determined by display parameters. In this embodiment of the present disclosure, the display parameters of the first window may include its position and size information, specifically: the X coordinate, on the screen mapping area, of the first pixel at the window's upper-left corner; the Y coordinate of that pixel on the terminal screen; the length of the first window; and the width of the first window. From these four display parameters the terminal can display the first window on the screen mapping area. The display area of a display window may thus be represented by a display start coordinate together with the window's length and width; the display area is generally smaller than or equal to the screen mapping area of the screen being operated, i.e., it lies within the screen mapping area. The program's window is then displayed in a display area within the screen mapping area in which it runs. It will be appreciated that the display area may occupy only part of the screen mapping area, while other parts display other content, such as running windows of other programs.
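The four display parameters above (upper-left X and Y plus length and width) can be illustrated with a small helper that keeps a window inside the screen mapping area. This is a hypothetical sketch; the disclosure does not prescribe any clamping behaviour, and the function and parameter names are assumptions.

```python
def place_window(x, y, w, h, screen_w, screen_h):
    """Clamp a window's display parameters (top-left X/Y plus length
    and width) so the whole window stays inside the screen mapping
    area of size screen_w by screen_h."""
    w = min(w, screen_w)              # window no larger than the area
    h = min(h, screen_h)
    x = max(0, min(x, screen_w - w))  # shift back inside if overhanging
    y = max(0, min(y, screen_h - h))
    return x, y, w, h
```

For instance, a 400x300 window requested at X = 1800 on a 1920-pixel-wide mapping area is shifted left to X = 1520 so it remains fully visible.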
Therefore, the user can simultaneously operate a plurality of windows, and the information processing efficiency of the user is improved.
Because there is no operation-priority distinction between the windows, the user can also control several windows at the same time. In the embodiment of the present disclosure, the terminal simultaneously receives and executes operation instructions directed respectively at the first window and a second window. Specifically, an operation instruction may modify a window's display parameters; for example, the terminal receives changed position and/or size information produced by dragging a window on the screen with a mouse or a finger. No specific limitation is placed on this in the present embodiment.
In addition, in the embodiment of the present disclosure, a window manager may determine the sizes of the windows' respective display areas according to the size and number of displayable areas on the screen mapping area available for showing them. This avoids an unreasonable window arrangement that would hamper the user's operation.
FIG. 6 is a schematic diagram of a multi-window arrangement of an embodiment of the present disclosure. As shown in fig. 6, four windows can be created simultaneously in the display area of the screen mapping area available for showing multiple windows, displayed at the same size and tiled. All of them, from window 1 on the upper layer to window 4 on the lower layer, are in the activated state. In this embodiment, when the user operates one of the windows, say window 4 (by mouse click, touch-screen tap, drag, keyboard operation, etc.), the terminal receives the operation signal, determines which window it targets, and in response raises window 4 to the uppermost layer for display, with window 1 running on the second layer. In another embodiment of the present disclosure, the terminal may also receive operations on two windows at once. For example, the user drags window 1 and window 2 simultaneously; the terminal correspondingly receives a first and a second operation signal, changes the coordinates of windows 1 and 2 in the display area in response, and the windows' layer order is positively correlated with the order in which the signals were generated. For example, the user first drags window 1, which is then uppermost; while still dragging window 1, the user starts dragging window 2, whereupon window 2 becomes uppermost and window 1 drops to the second layer for display.
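The layer-order rule just described, i.e. that the most recently operated window rises to the uppermost layer, can be sketched as follows (illustrative Python; the `ZOrder` name and list-based stack are assumptions for this sketch):

```python
class ZOrder:
    """Keeps windows' layer order positively correlated with the order
    of user operation signals: the most recently operated window moves
    to the uppermost layer (index 0)."""

    def __init__(self, windows):
        self.stack = list(windows)   # index 0 = uppermost layer

    def on_operation(self, window):
        # Raise the operated window to the top; everything else shifts down.
        self.stack.remove(window)
        self.stack.insert(0, window)
```

Dragging window 1 and then window 2 leaves window 2 uppermost and window 1 on the second layer, matching the behaviour in the paragraph above.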
Fig. 7 is a schematic diagram of a multi-window arrangement according to another embodiment of the present disclosure. As shown in fig. 7, the windows do not overlap and lie in the same layer, so they do not affect one another, giving a better operating experience. It will be appreciated that the windows in this embodiment may likewise be dragged or otherwise operated simultaneously, which is not repeated here. In addition, although fig. 7 shows four windows of the same size, embodiments of the present disclosure are not limited to this, and the windows may be displayed at different sizes.
Another embodiment of the present disclosure further provides a multi-window parallel method, as shown in fig. 8, including:
s1001, loading a first window, creating a first channel, binding the first window and the first channel, and transmitting information to the first window through the first channel.
S1002, loading an Nth window, where N is an integer greater than 1; creating N Nth-generation channels; binding one of them with the Nth window and transmitting information to the Nth window through it; and unbinding the existing (N-1) windows from their previous-generation channels, binding each of those windows to one of the remaining Nth-generation channels, and transmitting information to each of them through its newly bound channel.
In particular, the above steps describe how instruction channels are allocated in multi-window parallel operation. For example, window 1 is first created, a first-generation channel is created for it, and window 1 is bound to that channel, through which instructions are sent to it. Window 2 is then created along with two second-generation channels; one of them is bound to window 2 and carries its instructions. Next, window 1 is unbound from the first-generation channel and bound to the other second-generation channel, so that the two second-generation channels send instructions to windows 1 and 2 respectively, and both windows are operated at the same time. As more windows are loaded, the allocated channels are updated accordingly, and the user can, for instance, drag window 1 and window 2 simultaneously: the terminal receives a first and a second operation signal and changes the coordinates of both windows in the display area in response. The layer order of windows 1 and 2 is positively correlated with the order of the signals; for example, when the user first drags window 1 it is uppermost, and when the user then starts dragging window 2 while window 1 is still being dragged, window 2 becomes uppermost and window 1 drops to the second layer for display.
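The channel allocation in S1001 and S1002 can be modelled as follows. This is an illustrative Python sketch; real channels would be IPC connections or event queues, and the `ChannelAllocator` name and `(generation, slot)` representation are assumptions.

```python
class ChannelAllocator:
    """When the Nth window is created, N generation-N channels are
    created; every live window is unbound from its old channel and
    rebound to one of the new ones (sketch of S1001 and S1002)."""

    def __init__(self):
        self.windows = []
        self.binding = {}   # window -> (generation, slot) of its channel

    def load_window(self, window):
        self.windows.append(window)
        gen = len(self.windows)   # the Nth window triggers N new channels
        for slot, w in enumerate(self.windows):
            # Unbind from the previous generation, rebind to the new one.
            self.binding[w] = (gen, slot)
```

After loading a second window, both windows hold second-generation channels, so instructions can be sent to each through its own channel and the windows operated simultaneously.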
S13, sharing the information to a specified location.
If the user issued no relocation request for the result returned in step S12, the returned video frame or video is sent directly to the specified location; if the user relocated, the information sent is the video frame or video obtained by the last relocation. Sharing may consist, for example, of sending the information, together with the address of the other party's client, to the server; of course, other ways of sharing the information may also be adopted in this embodiment of the application, without specific limitation here.
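One possible shape for the sharing step, packaging the extracted frames with the recipient client's address for upload to the server, might be the following. This is purely illustrative: the field names, the JSON encoding, and the function name are assumptions, not part of the disclosure, which leaves the sharing mechanism open.

```python
import json

def build_share_payload(frames, recipient_address):
    """Package the extracted frame(s) with the recipient client's
    address for sending to the server (one hypothetical scheme)."""
    return json.dumps({
        "recipient": recipient_address,   # other party's client address
        "frame_count": len(frames),
        "frames": frames,                 # frame indices or encoded data
    })
```

A transport layer would then POST this payload to the server, which forwards it to the recipient client; none of that is specified by the disclosure.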
The embodiment of the present disclosure further provides an information sharing apparatus 10 without interrupting playing. As shown in fig. 9, it includes a receiving module 30, a time module 50, an extracting module 70, and a sending module 90. The receiving module 30 may be configured to receive instruction information. The time module 50 may be configured to obtain a timestamp of the receipt of the instruction information. The extraction module 70 may be configured to extract information, according to the timestamp, from the video file whose playing is not interrupted. The sending module 90 may be configured to send the information to a specified location. In addition, the receiving module 30 may also receive relocation requests and other commands and requests issued by the user.
In another embodiment of the present disclosure, the device 10 for information sharing without interrupting playing may further include a setting module 20, as shown in fig. 10, the setting module 20 may be configured to set a value field of the timestamp.
For the embodiments of the apparatus, since they correspond substantially to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described apparatus embodiments are merely illustrative, wherein the modules described as separate modules may or may not be separate. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The information sharing method and apparatus of the present disclosure are described above based on the embodiments and application examples. In addition, the present disclosure also provides a terminal and a storage medium, which are explained below.
Referring now to fig. 11, a schematic diagram of an electronic device (e.g., a terminal device or server) 800 suitable for use in implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 11 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 11, electronic device 800 may include a processing means (e.g., central processing unit, graphics processor, etc.) 801 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage means 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data necessary for the operation of the electronic apparatus 800 are also stored. The processing apparatus 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Generally, the following devices may be connected to the I/O interface 805: input devices 806 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 807 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage 808 including, for example, magnetic tape, hard disk, etc.; and a communication device 809. The communication means 809 may allow the electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While fig. 11 illustrates an electronic device 800 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may be alternatively implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication means 809, or installed from the storage means 808, or installed from the ROM 802. The computer program, when executed by the processing apparatus 801, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may interconnect with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may be separate and not incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods of the present disclosure as described above.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit does not, in some cases, constitute a limitation on the unit itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided an information sharing method, including:
receiving instruction information, and recording a timestamp of receiving the instruction information;
extracting video frame information corresponding to the time stamp from the video file which is played uninterruptedly according to the time stamp; and
sharing the information to a specified location.
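The three steps above can be sketched in Python as follows. This is a minimal illustration under assumed names (`share_frame`, a list of frames plus an FPS standing in for the video file), not the disclosed implementation:

```python
def share_frame(video_frames, fps, playback_start, instruction_time, destination):
    """Receive an instruction, record its timestamp, extract the matching
    frame from the still-playing video, and share it to the destination."""
    # Step 1: the timestamp of receiving the instruction information, expressed
    # as an offset into the video (playback is never paused or interrupted).
    position = instruction_time - playback_start      # seconds into the video
    # Step 2: extract the video frame information corresponding to the timestamp.
    frame_index = max(0, min(int(position * fps), len(video_frames) - 1))
    frame = video_frames[frame_index]
    # Step 3: share the extracted information to the specified location.
    destination.append(frame)
    return frame
```

At 25 fps, an instruction arriving 2 seconds into playback selects frame 50; the playing video itself is only read, never stopped.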
According to one or more embodiments of the present disclosure, there is provided an information sharing method, where the receiving instruction information, and the recording a timestamp of the received instruction information includes:
receiving first instruction information, and in response to the first instruction information, acquiring a first timestamp of the video file which is played uninterruptedly at the moment the first instruction information is received.
According to one or more embodiments of the present disclosure, there is provided an information sharing method, where the extracting, according to the timestamp, video frame information corresponding to the timestamp from a video file that is played uninterruptedly includes:
acquiring a corresponding first video image frame from the video file which is played uninterruptedly according to the first timestamp; and setting the value range of the first timestamp, and acquiring a corresponding first continuous video image frame from the video file which is played uninterruptedly according to the value range of the first timestamp.
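A sketch of the value-range step: given the first timestamp and a range of ±`half_width` seconds around it, the continuous run of frames inside the range is returned. The symmetric window is an assumption — the patent only names a value range, not its shape — and all identifiers are illustrative:

```python
def frames_in_window(video_frames, fps, first_timestamp, half_width):
    """Return the continuous video image frames whose positions fall inside
    the value range [first_timestamp - half_width, first_timestamp + half_width]
    of the uninterruptedly playing video (represented as a frame list + FPS)."""
    start = max(0.0, first_timestamp - half_width)     # clamp to video start
    end = first_timestamp + half_width
    start_idx = int(start * fps)
    end_idx = min(int(end * fps), len(video_frames) - 1)  # clamp to video end
    return video_frames[start_idx:end_idx + 1]
```

For a 10-second, 25 fps video, a first timestamp of 5.0 s with a 1.0 s half-width yields the continuous frames from the 4 s mark to the 6 s mark.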
According to one or more embodiments of the present disclosure, there is provided an information sharing method, after the step of extracting video frame information corresponding to a time stamp from a video file that is played uninterruptedly according to the time stamp, the method further includes:
outputting the first video image frame.
According to one or more embodiments of the present disclosure, there is provided an information sharing method, wherein the method further includes:
outputting the first continuous video image frame, receiving a first positioning instruction, and recording a second timestamp of receiving the first positioning instruction; and
acquiring a corresponding second video image frame from the video file which is played uninterruptedly according to the second timestamp.
According to one or more embodiments of the present disclosure, there is provided an information sharing method, wherein the information includes the second video image frame.
According to one or more embodiments of the present disclosure, there is provided an information sharing method, where the receiving instruction information, and the recording a timestamp of the received instruction information includes:
receiving second instruction information, and in response to the second instruction information, acquiring a third timestamp of the video file which is played uninterruptedly at the moment the second instruction information is received;
receiving third instruction information, and in response to the third instruction information, acquiring a fourth timestamp of the video file which is played uninterruptedly at the moment the third instruction information is received; and
acquiring an intermediate timestamp between the third timestamp and the fourth timestamp.
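One possible reading of the intermediate-timestamp step, sketched in Python. The patent does not specify how intermediate timestamps between the third and fourth timestamps are chosen, so the fixed-step sampling below is purely an assumption:

```python
def intermediate_timestamps(third_ts, fourth_ts, step):
    """Sample the open interval (third_ts, fourth_ts) at a fixed step to
    obtain intermediate timestamps. The sampling rule is an assumption."""
    if fourth_ts <= third_ts:
        raise ValueError("fourth timestamp must follow the third timestamp")
    ts = []
    t = third_ts + step
    while t < fourth_ts:
        ts.append(round(t, 6))   # round to avoid float drift in the output
        t += step
    return ts
```

With the third timestamp at 2.0 s, the fourth at 3.0 s, and a 0.25 s step, the intermediate timestamps are 2.25 s, 2.5 s, and 2.75 s; together with the two endpoints they index the continuous first video image frames.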
According to one or more embodiments of the present disclosure, there is provided a method for information sharing without interrupting playing, where the step of extracting corresponding information, according to the timestamp, from the video file which is played uninterruptedly includes:
acquiring continuous first video image frames from the video file which is played uninterruptedly according to the third timestamp, the fourth timestamp, and the intermediate timestamp; respectively setting the value range of the third timestamp and the value range of the fourth timestamp, acquiring a corresponding second continuous video image frame from the video file which is played uninterruptedly according to the value range of the third timestamp, and acquiring a corresponding third continuous video image frame from the video file which is played uninterruptedly according to the value range of the fourth timestamp.
According to one or more embodiments of the present disclosure, there is provided a method for information sharing without interrupting playing, where after the step of extracting corresponding information, according to the timestamp, from the video file which is played uninterruptedly, the method further includes:
outputting the continuous first video image frames.
According to one or more embodiments of the present disclosure, there is provided an information sharing method, wherein the information includes the continuous first video image frames.
According to one or more embodiments of the present disclosure, there is provided an information sharing method, wherein the method further includes:
for the continuous first video image frames, receiving a second positioning request, outputting the second continuous video image frame, receiving a second positioning instruction, and recording a fifth timestamp of receiving the second positioning instruction; and/or
for the continuous first video image frames, receiving a third positioning request, outputting the third continuous video image frame, receiving a third positioning instruction, and recording a sixth timestamp of receiving the third positioning instruction; and
acquiring continuous third video image frames from the video file which is played uninterruptedly according to the fifth timestamp and/or the sixth timestamp.
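The positioning steps can be sketched as follows: the fifth and/or sixth timestamps, recorded when the positioning instructions arrive, replace the start and/or end of the clip, and the continuous frames are re-extracted from the still-playing video. All names and the default-boundary behaviour are illustrative assumptions, not the patent's specification:

```python
def refine_clip(video_frames, fps, fifth_ts=None, sixth_ts=None,
                default_start=0.0, default_end=None):
    """Re-extract continuous frames using the fifth timestamp as the new clip
    start and/or the sixth timestamp as the new clip end; when a positioning
    timestamp is absent, a default boundary is used instead (an assumption)."""
    if default_end is None:
        default_end = len(video_frames) / fps          # end of the video
    start = fifth_ts if fifth_ts is not None else default_start
    end = sixth_ts if sixth_ts is not None else default_end
    start_idx = max(0, int(start * fps))
    end_idx = min(int(end * fps), len(video_frames) - 1)
    return video_frames[start_idx:end_idx + 1]
```

With a fifth timestamp of 1.0 s and a sixth of 2.0 s at 25 fps, the refined clip is the continuous run of frames between the 1 s and 2 s marks.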
According to one or more embodiments of the present disclosure, there is provided an information sharing method, wherein the information includes the continuous third video image frames.
According to one or more embodiments of the present disclosure, there is provided an information sharing apparatus without interrupting playback, including:
the receiving module is used for receiving instruction information;
the time module is used for acquiring a timestamp for receiving the instruction information;
the extraction module is used for extracting, according to the timestamp, information from the video file which is played uninterruptedly; and
the sending module is used for sending the information to a specified location.
According to one or more embodiments of the present disclosure, there is provided an apparatus for information sharing without interrupting playback, the apparatus further including:
the setting module is used for setting the value range of the timestamp.
According to one or more embodiments of the present disclosure, an apparatus for information sharing without interrupting playing is provided, wherein the receiving module is further configured to receive a location request.
According to one or more embodiments of the present disclosure, there is provided a terminal including: at least one memory and at least one processor;
wherein the at least one memory is configured to store program code, and the at least one processor is configured to call the program code stored in the at least one memory to perform the method of any one of the above.
According to one or more embodiments of the present disclosure, there is provided a storage medium for storing program code for performing the above-described method.
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure herein is not limited to the particular combination of features described above, but also encompasses other combinations of the features described above or their equivalents without departing from the spirit of the disclosure. For example, the above features may be interchanged with (but are not limited to) technical features with similar functions disclosed in the present disclosure to form new technical solutions.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (18)

1. A method of information sharing, comprising:
receiving instruction information, and recording a timestamp of receiving the instruction information;
extracting video frame information corresponding to the time stamp from the video file which is played uninterruptedly according to the time stamp; and
sharing the information to a designated location; wherein playing the video file without interruption comprises: when the focus is transferred away from the video window, the video window is not notified, so as to keep the playing state of the video window; the video window operates in the manner of having the focus, and the captured video image frame/video operates in the window at the focus coordinates.
2. The method of claim 1, wherein the step of receiving instruction information and recording a timestamp of the received instruction information comprises:
receiving first instruction information, and in response to the first instruction information, acquiring a first timestamp of the video file which is played uninterruptedly at the moment the first instruction information is received.
3. The method according to claim 2, wherein the step of extracting video frame information corresponding to the time stamp from the video file that is played without interruption according to the time stamp comprises:
acquiring a corresponding first video image frame from the video file which is played uninterruptedly according to the first timestamp; and setting the value range of the first timestamp, and acquiring a corresponding first continuous video image frame from the video file which is played uninterruptedly according to the value range of the first timestamp.
4. The method according to claim 3, further comprising, after the step of extracting video frame information corresponding to the time stamp from the video file that is played uninterrupted according to the time stamp, the step of:
outputting the first video image frame.
5. The method of claim 4, wherein the information comprises the first video image frame.
6. The method of claim 4, further comprising:
outputting the first continuous video image frame, receiving a first positioning instruction, and recording a second timestamp of the received first positioning instruction; and
acquiring a corresponding second video image frame from the video file which is played uninterruptedly according to the second timestamp.
7. The method of claim 6, wherein the information comprises the second video image frame.
8. The method of claim 1, wherein the step of receiving instruction information and recording a timestamp of the received instruction information comprises:
receiving second instruction information, and in response to the second instruction information, acquiring a third timestamp of the video file which is played uninterruptedly at the moment the second instruction information is received;
receiving third instruction information, and in response to the third instruction information, acquiring a fourth timestamp of the video file which is played uninterruptedly at the moment the third instruction information is received; and
acquiring an intermediate timestamp between the third timestamp and the fourth timestamp.
9. The method according to claim 8, wherein the step of extracting the corresponding information from the video file that is played without interruption according to the timestamp comprises:
acquiring continuous first video image frames from the video file which is played uninterruptedly according to the third timestamp, the fourth timestamp, and the intermediate timestamp; respectively setting the value range of the third timestamp and the value range of the fourth timestamp, acquiring a corresponding second continuous video image frame from the video file which is played uninterruptedly according to the value range of the third timestamp, and acquiring a corresponding third continuous video image frame from the video file which is played uninterruptedly according to the value range of the fourth timestamp.
10. The method of claim 9, further comprising, after the step of extracting corresponding information from the uninterrupted play video file according to the time stamp:
outputting the continuous first video image frames.
11. The method of claim 10, wherein the information comprises the continuous first video image frames.
12. The method of claim 10, further comprising:
for the continuous first video image frames, receiving a second positioning request, outputting the second continuous video image frame, receiving a second positioning instruction, and recording a fifth timestamp of receiving the second positioning instruction; and/or
for the continuous first video image frames, receiving a third positioning request, outputting the third continuous video image frame, receiving a third positioning instruction, and recording a sixth timestamp of receiving the third positioning instruction; and
acquiring continuous third video image frames from the video file which is played uninterruptedly according to the fifth timestamp and/or the sixth timestamp.
13. The method of claim 12, wherein the information comprises the continuous third video image frames.
14. An apparatus for information sharing, comprising:
the receiving module is used for receiving instruction information;
the time module is used for acquiring a timestamp for receiving the instruction information;
the extracting module is used for extracting information from the video file which is not interrupted to be played according to the timestamp; and
the sending module is used for sending the information to a specified location; wherein playing the video file without interruption comprises: when the focus is transferred away from the video window, the video window is not notified, so as to keep the playing state of the video window; the video window operates in the manner of having the focus, and the captured video image frame/video operates in the window at the focus coordinates.
15. The apparatus of claim 14, further comprising:
the setting module is used for setting the value range of the timestamp.
16. The apparatus of claim 14, wherein the receiving module is further configured to receive a positioning request.
17. A terminal, comprising:
at least one memory and at least one processor;
wherein the at least one memory is configured to store program code and the at least one processor is configured to invoke the program code stored in the at least one memory to perform the method of any of claims 1 to 13.
18. A storage medium for storing program code for performing the method of any one of claims 1 to 13.
CN201910755806.4A 2019-08-15 2019-08-15 Information sharing method and device, terminal and storage medium Active CN110457522B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910755806.4A CN110457522B (en) 2019-08-15 2019-08-15 Information sharing method and device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910755806.4A CN110457522B (en) 2019-08-15 2019-08-15 Information sharing method and device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN110457522A CN110457522A (en) 2019-11-15
CN110457522B true CN110457522B (en) 2022-09-06

Family

ID=68486940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910755806.4A Active CN110457522B (en) 2019-08-15 2019-08-15 Information sharing method and device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110457522B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256173A (en) * 2020-10-20 2021-01-22 北京字节跳动网络技术有限公司 Window display method and device of electronic equipment, terminal and storage medium
CN114048048A (en) * 2021-11-15 2022-02-15 Oppo广东移动通信有限公司 Information sharing method and device, electronic equipment and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN104461242A (en) * 2014-12-08 2015-03-25 深圳市嘉乐派科技有限公司 Multiwindow interface realization method based on Android operating system
CN106162378A (en) * 2016-06-30 2016-11-23 乐视控股(北京)有限公司 The browsing method of video file and browsing apparatus

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
CN101815199B (en) * 2010-04-07 2013-08-07 中兴通讯股份有限公司 Video processing method and terminal
CN103369372B (en) * 2013-07-17 2017-11-10 广州珠江数码集团股份有限公司 A kind of live telecast screen-cutting system and method
CN103634683A (en) * 2013-11-29 2014-03-12 乐视致新电子科技(天津)有限公司 Screen capturing method and device for intelligent televisions
CN105812892A (en) * 2014-12-29 2016-07-27 深圳Tcl数字技术有限公司 Method, device and system for obtaining screenshot of dynamic display picture of television
CN105828146A (en) * 2016-03-21 2016-08-03 乐视网信息技术(北京)股份有限公司 Video image interception method, terminal and server
CN106412709A (en) * 2016-10-21 2017-02-15 上海与德信息技术有限公司 Video capturing method and video capturing device
CN107229402B (en) * 2017-05-22 2021-08-10 努比亚技术有限公司 Dynamic screen capturing method and device of terminal and readable storage medium
CN107613235B (en) * 2017-09-25 2019-12-27 北京达佳互联信息技术有限公司 Video recording method and device
CN109729420B (en) * 2017-10-27 2021-04-20 腾讯科技(深圳)有限公司 Picture processing method and device, mobile terminal and computer readable storage medium

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN104461242A (en) * 2014-12-08 2015-03-25 深圳市嘉乐派科技有限公司 Multiwindow interface realization method based on Android operating system
CN106162378A (en) * 2016-06-30 2016-11-23 乐视控股(北京)有限公司 The browsing method of video file and browsing apparatus

Non-Patent Citations (2)

Title
A Key Frame Selection Algorithm Based on Sliding Window and Image Features;Jigang Cao etc.;《2016 IEEE 22nd International Conference on Parallel and Distributed Systems》;20170119;第956-962页 *
Object Carousel Technology and Its Application in Information Service Systems; Yin Xilin; China Masters' Theses Full-text Database (Information Science and Technology); 20080315; pp. I138-398 *

Also Published As

Publication number Publication date
CN110457522A (en) 2019-11-15

Similar Documents

Publication Publication Date Title
CN113489937B (en) Video sharing method, device, equipment and medium
CN110457109B (en) Multi-window parallel method and device, terminal and storage medium
CN111787392A (en) Video screen projection method and device, electronic equipment and storage medium
CN112394892A (en) Screen projection method, screen projection equipment, mobile terminal and storage medium
CN109542614B (en) Resource allocation method, device, terminal and storage medium
CN109753332B (en) Method and device for displaying information
CN112242947B (en) Information processing method, device, equipment and medium
WO2023000888A1 (en) Cloud application implementing method and apparatus, electronic device, and storage medium
CN111790148B (en) Information interaction method and device in game scene and computer readable medium
WO2022237744A1 (en) Method and apparatus for presenting video, and device and medium
CN110457522B (en) Information sharing method and device, terminal and storage medium
CN114168018A (en) Data interaction method, data interaction device, electronic equipment, storage medium and program product
CN110401877B (en) Video playing control method and device, electronic equipment and storage medium
CN114390360B (en) Live voting method and device, electronic equipment and storage medium
CN110611847B (en) Video preview method and device, storage medium and electronic equipment
CN115474086B (en) Play control method, device, electronic equipment and storage medium
CN115543509A (en) Display method and device of session list, electronic equipment and storage medium
CN115237530A (en) Information display method and device, electronic equipment and storage medium
CN112312058B (en) Interaction method and device and electronic equipment
CN113127101A (en) Application program control method, device, equipment and medium
CN112243219A (en) Display device, terminal control method and device, terminal and storage medium
CN113010300A (en) Image effect refreshing method and device, electronic equipment and computer readable storage medium
US20240106928A1 (en) Media content sharing method, apparatus, electronic device and storage medium
US20240143649A1 (en) Multimedia information processing method, apparatus, electronic device, and medium
CN114827736B (en) Video playback method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant