CN113721816A - Video processing method and device - Google Patents

Video processing method and device

Info

Publication number
CN113721816A
CN113721816A (application number CN202111012297.XA)
Authority
CN
China
Prior art keywords
video
window
input
sub
interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111012297.XA
Other languages
Chinese (zh)
Inventor
程金鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202111012297.XA
Publication of CN113721816A
Priority to PCT/CN2022/115453 (WO2023030234A1)
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 20/00 Signal processing not specific to the method of recording or reproducing; Circuits therefor
    • G11B 20/10 Digital recording or reproducing
    • G11B 20/10527 Audio or video recording; Data buffering arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M 1/72439 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a video processing method and device, and belongs to the technical field of video processing. The method includes the following steps: receiving a first input for a first interface, the first interface including a video playing window; in response to the first input, updating and displaying the first interface as a first sub-interface and a second sub-interface, where the first sub-interface includes the video playing window, the second sub-interface includes a video control window, and the video control window is used for storing videos in different video application programs to the video control; receiving a second input for moving the video playing window to the video control window; and in response to the second input, saving the target video being played in the video playing window to the video control.

Description

Video processing method and device
Technical Field
The present application relates to the field of video processing technologies, and in particular, to a video processing method and apparatus.
Background
With the continuous development of terminal device technology, video playing software and video portal websites keep emerging, and video playback has become an indispensable function of terminal devices.
As video resources keep multiplying, users have more and more websites and applications in which to watch videos. Users watch all kinds of videos on different video portals and in different video playing applications, and they need to save videos they like or have not finished watching.
In the related art, each video portal or video playing application saves the user's videos inside its own application. If the user wants to find and continue watching a saved video, the user has to enter that specific video portal or video playing application first and then search for the saved video; the operation steps are cumbersome and the entry point is hard to find.
Disclosure of Invention
The embodiments of the present application aim to provide a video processing method and a video processing apparatus, which can solve the problem in the related art that videos saved by users in different video portals and video playing applications cannot be managed and controlled in a unified way.
In a first aspect, an embodiment of the present application provides a video processing method, which includes:
receiving a first input for a first interface; the first interface comprises a video playing window;
in response to the first input, updating and displaying the first interface as a first sub-interface and a second sub-interface; the first sub-interface comprises the video playing window, and the second sub-interface comprises a video control window; the video control window is used for storing videos in different video application programs to the video control;
receiving a second input for moving the video playing window to the video control window;
and in response to the second input, saving the target video being played in the video playing window to the video control.
In a second aspect, an embodiment of the present application provides a video processing apparatus, which includes:
a first receiving module for receiving a first input for a first interface; the first interface comprises a video playing window;
the updating module is used for, in response to the first input, updating and displaying the first interface as a first sub-interface and a second sub-interface; the first sub-interface comprises the video playing window, and the second sub-interface comprises a video control window; the video control window is used for storing videos in different video application programs to the video control;
a second receiving module, configured to receive a second input for moving the video playing window to the video control window;
and the storage module is used for, in response to the second input, saving the target video being played in the video playing window to the video control.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instructions stored in the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a readable storage medium storing a program or instructions which, when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the present application, while a user is playing a video on an electronic device, the video playing interface is updated and displayed as a first sub-interface and a second sub-interface based on an input of the user. The first sub-interface includes a video playing window, and the second sub-interface includes a video control window. Since the video control window is used for saving videos in different video applications to the video control, the user can save the video being played in the video playing window to the video control through an input that moves the video playing window to the video control window. In this way, the videos that the user needs to save in different video applications are saved to the video control in a unified manner, and unified control of videos in different applications is realized.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a flowchart of a video processing method according to an embodiment of the present application;
fig. 2 to 4 are schematic display interfaces of an electronic device according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a video processing apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to another embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application can be implemented in orders other than those illustrated or described herein. Objects distinguished by "first", "second", and the like are usually of one class, and their number is not limited; for example, the first object may be one or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects before and after it.
The video processing method provided in the embodiments of the present application is described in detail below through specific embodiments and application scenarios thereof with reference to the accompanying drawings.
Please refer to fig. 1, which is a flowchart of a video processing method according to an embodiment of the present application. The method can be applied to an electronic device, which may be a mobile phone, a tablet computer, a notebook computer, or the like. As shown in fig. 1, the video processing method may include the following steps S1100 to S1400, which are described in detail below.
Step S1100, a first input for a first interface is received.
The first interface includes a video playback window.
The video playing window is used for presenting a video playing screen, which can be displayed after a video application is started. Generally, after a video playing application is started, the video playing screen is displayed on the first interface in full-screen mode, that is, the first interface includes only the video playing window.
The first input for the first interface may be a touch input that taps the first interface a set number of times consecutively within a preset time, where the preset time and the set number of times are preset values.
For example, as shown in fig. 2, the user enters a video application to play a video; at this time, the first interface includes only the video playing window, that is, the first interface displays the video playing screen in full-screen mode. Here, the user may tap the first interface twice within 1 second.
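For illustration only, the following Kotlin sketch shows one possible way to detect such an input, namely a set number of taps within a preset time; the class name, parameters, and timing values are hypothetical and are not part of the method of this application.

```kotlin
// Hypothetical multi-tap detector: reports a "first input" when the user
// taps the first interface `requiredTaps` times within `windowMillis`.
class MultiTapDetector(
    private val requiredTaps: Int = 2,        // set number of times (e.g. 2 taps)
    private val windowMillis: Long = 1_000L   // preset time (e.g. 1 second)
) {
    private val tapTimes = ArrayDeque<Long>()

    /** Call on every tap; returns true when the tap sequence qualifies as the first input. */
    fun onTap(timestampMillis: Long): Boolean {
        tapTimes.addLast(timestampMillis)
        // Drop taps that fell outside the sliding time window.
        while (tapTimes.isNotEmpty() && timestampMillis - tapTimes.first() > windowMillis) {
            tapTimes.removeFirst()
        }
        if (tapTimes.size >= requiredTaps) {
            tapTimes.clear()
            return true
        }
        return false
    }
}

fun main() {
    val detector = MultiTapDetector()
    println(detector.onTap(0L))    // false: only one tap so far
    println(detector.onTap(400L))  // true: two taps within 1 second
}
```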
After the first input for the first interface is received, the method proceeds to:
Step S1200, in response to the first input, updating and displaying the first interface as a first sub-interface and a second sub-interface.
The first sub-interface comprises a video playing window, and the second sub-interface comprises a video control window.
The video control window is used for storing videos in different video applications to the video control.
In this embodiment, the association relationship between the video control and different video applications may be established in advance, for example through an interface provided in the video control.
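As a sketch under the assumption that this association is kept as a simple registry, the Kotlin below illustrates how the video control might record which video applications it is associated with; the type name and the application identifiers are hypothetical and not prescribed by this application.

```kotlin
// Hypothetical registry of the video applications associated with the video control.
// The association is established in advance, e.g. through an interface in the video control.
class VideoControlAssociations {
    private val associatedApps = mutableSetOf<String>()

    /** Associates a video application (identified here by a package-like id) with the video control. */
    fun associate(appId: String) {
        associatedApps += appId
    }

    /** Returns true if videos from this application may be saved to the video control. */
    fun isAssociated(appId: String): Boolean = appId in associatedApps
}

fun main() {
    val associations = VideoControlAssociations()
    associations.associate("com.example.videoapp.a")   // hypothetical application ids
    associations.associate("com.example.videoapp.b")
    println(associations.isAssociated("com.example.videoapp.a")) // true
    println(associations.isAssociated("com.example.videoapp.c")) // false
}
```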
The video control window includes a first video control sub-window and a second video control sub-window.
The first video control sub-window is used for collecting videos in different video applications to the video control; it can be understood that, when playing videos in video applications, a user needs to collect videos that he or she likes.
The second video control sub-window is used for adding videos in different video applications to the video control; it can be understood that, when playing videos in video applications, a user needs to record, among other things, videos that have not been watched to the end.
Illustratively, when the user taps the first interface twice within 1 second, the display interface shown in fig. 3 may be presented, that is, the first interface in fig. 2 is updated to a first sub-interface and a second sub-interface. The first sub-interface is the left interface shown in fig. 3 and includes the video playing window; the second sub-interface is the right interface shown in fig. 3 and includes the video control window. In fig. 3, the video control window includes a first video control sub-window and a second video control sub-window, where the first video control sub-window is located in the upper part of the second sub-interface and the second video control sub-window is located in the lower part of the second sub-interface. Illustratively, the first video control sub-window may display the prompt "Drag directly here and the video will be collected to the video control", and the second video control sub-window may display the prompt "Drag directly here and the video will be added to the video control".
After the first interface is updated and displayed as the first sub-interface and the second sub-interface in response to the first input, the method proceeds to:
step S1300, receiving a second input for moving the video playing window to the video control window.
In one example, the second input is used for moving the video playing window displayed in the first sub-interface to the first video control sub-window.
In this example, receiving the second input for moving the video playing window to the video control window in step S1300 may further include: receiving a second input that moves the video playing window displayed in the first sub-interface to the first video control sub-window.
In this example, as shown in fig. 4, when the video played in the video playing window needs to be collected to the video control, the user may drag the video playing window displayed in the first sub-interface into the first video control sub-window.
In one example, the second input may also be used for moving the video playing window displayed in the first sub-interface to the second video control sub-window.
In this example, receiving the second input for moving the video playing window to the video control window in step S1300 may further include: receiving a second input that moves the video playing window displayed in the first sub-interface to the second video control sub-window.
In this example, as shown in fig. 4, when a video that has not been watched to the end and is being played in the video playing window needs to be added to the video control, the user may drag the video playing window displayed in the first sub-interface into the second video control sub-window.
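The two drag examples above can be summarized, purely as a sketch, by routing the drop target to either a collect action or an add action; the Kotlin below uses hypothetical names and an in-memory store rather than any real widget or drag-and-drop API.

```kotlin
// Hypothetical drop handling: dragging the playing window onto the first video
// control sub-window collects the video; dragging it onto the second sub-window
// adds it (e.g. a video that has not been watched to the end).
enum class DropTarget { FIRST_SUB_WINDOW, SECOND_SUB_WINDOW }

data class VideoControlEntry(
    val appId: String,        // which video application the video belongs to
    val videoId: String,      // identifier of the target video inside that application
    val collected: Boolean    // true = collected, false = added
)

class VideoControl {
    val entries = mutableListOf<VideoControlEntry>()

    fun onPlaybackWindowDropped(target: DropTarget, appId: String, videoId: String) {
        entries += VideoControlEntry(
            appId = appId,
            videoId = videoId,
            collected = (target == DropTarget.FIRST_SUB_WINDOW)
        )
    }
}

fun main() {
    val control = VideoControl()
    control.onPlaybackWindowDropped(DropTarget.FIRST_SUB_WINDOW, "com.example.videoapp.a", "movie-42")
    control.onPlaybackWindowDropped(DropTarget.SECOND_SUB_WINDOW, "com.example.videoapp.b", "series-7-ep-3")
    control.entries.forEach(::println)
}
```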
After the second input for moving the video playing window to the video control window is received, the method proceeds to:
step S1400, in response to the second input, storing the target video being played in the video playing window to the video control.
In one example, where the second input is used to move the video playing window displayed in the first sub-interface to the first video control sub-window, the electronic device can, in response to the second input, collect the target video being played in the video playing window into the video control.
In one example, where the second input is used to move the video playing window displayed in the first sub-interface to the second video control sub-window, the electronic device can, in response to the second input, add the target video being played in the video playing window to the video control.
According to the method of this embodiment, while a user plays a video on the electronic device, the video playing interface is updated and displayed as a first sub-interface and a second sub-interface based on an input of the user, where the first sub-interface includes the video playing window and the second sub-interface includes the video control window. Since the video control window is used for saving videos in different video applications to the video control, the user can save the video being played in the video playing window to the video control through an input that moves the video playing window to the video control window. In this way, the videos that the user needs to save in different video applications are saved to the video control in a unified manner, and unified control of videos in different applications is realized.
In one embodiment, after saving the target video being played in the video playing window to the video control according to the above step S1400, the video processing method of the embodiment of the present disclosure further includes the following steps S2100 to S2200:
step S2100, a third input for the video control is received.
Videos saved by the user in different video applications are recorded in the video control, and an association relationship is established in advance between the video control and the different video applications.
Typically, icons of video controls are displayed on a display interface of the electronic device. When the user makes a click input with respect to the icon of the video control, the electronic device may respond to the input.
In step S2200, in response to a third input, the first video list and the second video list are displayed.
In this embodiment, when the user performs a click input with respect to the icon of the video control, the electronic device may respond to the input to further display the first video list and the second video list. The first video list comprises videos in different application programs collected by the user, and the second video list comprises videos in different application programs added by the user.
That is, the first video list includes videos from different applications that are collected to the video control, and the second video list includes videos from different applications that are added to the video control.
It can be understood that, when the user drags the video playing window displayed in the first sub-interface to the first video control sub-window, the target video currently being played in the video playing window is collected into the first video list.
It can be understood that, when the user drags the video playing window displayed in the first sub-interface to the second video control sub-window, the target video currently being played in the video playing window is added to the second video list.
According to this embodiment, the user can operate the video control to display the first video list of collected videos from different applications and the second video list of added videos from different applications, so that the user can further perform personalized operations on these videos.
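As an illustration, the first and second video lists can be thought of as two views over the saved entries, split by whether an entry was collected or added; the Kotlin sketch below uses hypothetical type and field names.

```kotlin
// Hypothetical saved record: entries with `collected == true` feed the first video
// list, entries with `collected == false` (added videos) feed the second video list.
data class SavedVideo(val appId: String, val videoId: String, val collected: Boolean)

fun firstVideoList(entries: List<SavedVideo>): List<SavedVideo> =
    entries.filter { it.collected }

fun secondVideoList(entries: List<SavedVideo>): List<SavedVideo> =
    entries.filter { !it.collected }

fun main() {
    val entries = listOf(
        SavedVideo("com.example.videoapp.a", "movie-42", collected = true),
        SavedVideo("com.example.videoapp.b", "series-7-ep-3", collected = false)
    )
    println("First video list (collected): ${firstVideoList(entries)}")
    println("Second video list (added):    ${secondVideoList(entries)}")
}
```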
In one embodiment, after the first video list and the second video list are displayed in response to the third input in step S2200, the video processing method of the embodiment of the present disclosure may further include the following steps S3100 to S3200:
in step S3100, in a case where the first video list includes the target video, a fourth input for the target video is received.
In this embodiment, when the target video has been collected to the video control, the target video is located in the first video list; in this case, the user may further perform a click input on the target video.
Step S3200, in response to the fourth input, invoking the target video application to replay the target video.
The target video application is a video application that previously played the target video.
In this embodiment, after the user performs the click input on the target video, the electronic device may call the target video application program to replay the target video.
According to the embodiment, when the target video is collected to the video control, the target video can be further replayed through the video control.
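One possible way to hand such a replay request back to the target video application is a deep-link style call; the URI scheme, identifiers, and function below are hypothetical, since this application does not prescribe a particular invocation mechanism.

```kotlin
// Hypothetical replay dispatch: build a deep-link-like request that asks the
// target video application (the one that originally played the video) to play
// the target video again from the beginning.
data class ReplayRequest(val appId: String, val uri: String)

fun buildReplayRequest(appId: String, videoId: String): ReplayRequest =
    ReplayRequest(appId, uri = "videoapp://$appId/play?video=$videoId&position=0")

fun main() {
    val request = buildReplayRequest("com.example.videoapp.a", "movie-42")
    // A real implementation would dispatch this to the platform (e.g. as an intent);
    // here we only print the assembled request.
    println("Dispatch to ${request.appId}: ${request.uri}")
}
```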
In one embodiment, after the first video list and the second video list are displayed in response to the third input in step S2200, the video processing method of the embodiment of the present disclosure may further include the following steps S4100 to S4300:
in step S4100, in a case where the second video list includes the target video, a fifth input for the target video is received.
In this embodiment, when the target video has been added to the video control, the target video is located in the second video list; in this case, the user may further perform a click input on the target video.
Step S4200, in response to the fifth input, acquiring the current playing time of the target video.
In this embodiment, after the user performs the click input on the target video, the electronic device may first obtain the current playing time of the target video.
Step S4300, calling the target video application program to continue playing the target video based on the current playing time.
In this embodiment, after the current playing time of the target video is obtained, the target video application program may be called to continue playing the target video based on the current playing time.
According to the embodiment, when the target video is added to the video control, the target video can be further continuously played through the video control based on the last playing time point.
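A sketch of resuming playback from the recorded current playing time is given below; the record type, the stored position, and the URI format are hypothetical assumptions rather than part of the claimed method.

```kotlin
// Hypothetical resume flow: the current playing time recorded when the video was
// added to the video control is used to continue playback in the target application.
data class AddedVideo(val appId: String, val videoId: String, val positionMillis: Long)

fun buildResumeUri(video: AddedVideo): String =
    "videoapp://${video.appId}/play?video=${video.videoId}&position=${video.positionMillis}"

fun main() {
    // The playing time (e.g. 12 minutes 30 seconds) was captured when the user added the video.
    val video = AddedVideo("com.example.videoapp.b", "series-7-ep-3", positionMillis = 750_000L)
    println("Continue playing via: ${buildResumeUri(video)}")
}
```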
In one embodiment, after the first video list and the second video list are displayed in response to the third input in step S2200, the video processing method of the embodiment of the present disclosure may further include the following steps S5100 to S5200:
in step S5100, in a case where the first video list includes the target video, a sixth input for the target video is received.
In this embodiment, when the target video is collected to the video control, the target video is located in the first video list, and when the target video is located in the first video list, the user may further perform click input on the target video.
In step S5200, in response to the sixth input, the target video is shared.
In this embodiment, after the user performs a click input on the target video, sharing objects can be displayed, so that the user can conveniently share the target video with a target object.
According to this embodiment, when the target video has been collected to the video control, the target video can be further shared through the video control.
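Purely as a sketch, sharing a collected video can be viewed as assembling a share payload for the chosen sharing object and handing it to the platform's share mechanism; all names and the link format below are hypothetical.

```kotlin
// Hypothetical share payload built for a collected video; the actual delivery
// (messaging app, social app, etc.) is left to the platform's share mechanism.
data class SharePayload(val title: String, val link: String)

fun buildSharePayload(appId: String, videoId: String, title: String): SharePayload =
    SharePayload(title = title, link = "videoapp://$appId/play?video=$videoId")

fun main() {
    val payload = buildSharePayload("com.example.videoapp.a", "movie-42", title = "Movie 42")
    println("Share \"${payload.title}\": ${payload.link}")
}
```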
Next, a video processing method according to an example is described. The method may include the following steps:
step S6100, displaying a first interface, wherein the first interface includes a video playing window.
In step S6200, a first input for a video playing window in the first interface is received.
In step S6200, the first input may be, for example, the user tapping the video playing window in the first interface twice within 1 second.
Step S6300, in response to the first input, updating and displaying the first interface as a first sub-interface and a second sub-interface.
The first sub-interface includes a video playback window.
The second sub-interface includes a video control window, where the video control window includes a first video control sub-window and a second video control sub-window. The first video control sub-window is used for collecting videos in different video applications to the video control, and the second video control sub-window is used for adding videos in different video applications to the video control.
In step S6300, as shown in fig. 2, when the video playing window in the first interface is tapped twice within 1 second, the electronic device may, as shown in fig. 3, update and display the first interface as a first sub-interface and a second sub-interface, where the first sub-interface includes the video playing window and the second sub-interface includes the first video control sub-window and the second video control sub-window.
Step S6400, receiving a second input for moving the video playing window displayed in the first sub-interface to the first video control sub-window; or, receiving a second input for moving the video playing window displayed in the first sub-interface to the second video control sub-window.
Step S6500, in response to the second input for moving the video playing window displayed in the first sub-interface to the first video control sub-window, collecting the target video being played in the video playing window to the video control; or, in response to the second input for moving the video playing window displayed in the first sub-interface to the second video control sub-window, adding the target video being played in the video playing window to the video control.
At step S6600, a third input for a video control is received.
Step S6700, in response to a third input, displaying the first video list and the second video list.
The first video list includes videos from different applications that are collected to the video control, and the second video list includes videos from different applications that are added to the video control.
Step S6800, in a case where the target video is located in the first video list, receiving a fourth input for the target video and calling the target video application program to replay the target video; or, in a case where the target video is located in the first video list, receiving a sixth input for the target video and sharing the target video.
Step S6900, receiving a fifth input for the target video when the target video is in the second video list, obtaining a current playing time of the target video, and calling the target video application program to continue playing the target video based on the current playing time.
According to this example, while playing a video on the electronic device, the user can enter the collection and addition interface through a quick tap on the playing interface, and collect or add the video through a simple drag gesture. The user can then open the video control on the desktop of the terminal device: in the collection list of the video control, the user can find collected videos and watch or share them again any number of times, and in the addition list of the video control, the user can find added videos and conveniently continue watching videos that have not been finished. This addresses the problems that video resources on existing electronic devices are scattered and that the user has to re-enter a specific video application or website to find and watch a video: favorite videos can be found through a single entry for re-watching or sharing, and unfinished videos are recorded through the same entry with support for jumping back to continue playback.
Corresponding to the above embodiments, referring to fig. 5, an embodiment of the present application further provides a video processing apparatus 500, including:
a first receiving module 510 for receiving a first input for a first interface; the first interface includes a video playback window.
An update module 520, configured to update and display the first interface as a first sub-interface and a second sub-interface in response to the first input; the first sub-interface comprises the video playing window, and the second sub-interface comprises a video control window; the video control window is used for storing videos in different video applications to the video control.
A second receiving module 530, configured to receive a second input for moving the video playing window to the video control window.
A storage module 540, configured to, in response to the second input, store the target video being played in the video playing window to the video control.
In one embodiment, the video control window includes a first video control sub-window and a second video control sub-window.
The second input is used for moving the video playing window to the first video control sub-window; the first video control sub-window is used for collecting videos in different video applications to the video control; or,
the second input is used for moving the video playing window to the second video control sub-window; the second video control sub-window is used for adding videos in different video applications to the video control.
In one embodiment, the apparatus 500 further comprises a third receiving module and a display module (neither shown in the figure).
A third receiving module to receive a third input for the video control.
A display module to display the first video list and the second video list in response to the third input.
Wherein the first video list comprises videos from different applications collected to the video control, and the second video list comprises videos from different applications added to the video control.
In one embodiment, the apparatus 500 further comprises a fourth receiving module and a first control module (not shown).
A fourth receiving module to receive a fourth input for the target video if the first video list includes the target video.
And the first control module is used for responding to the fourth input and calling a target video application program to replay the target video.
In one embodiment, the apparatus 500 further comprises a fifth receiving module, an obtaining module, and a second control module (none shown).
A fifth receiving module, configured to receive a fifth input for the target video if the second video list includes the target video.
And the acquisition module is used for responding to the fifth input and acquiring the current playing time point of the target video.
And the second control module is used for calling a target video application program to continue playing the target video based on the current playing time.
The video processing apparatus in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The apparatus may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine; the embodiments of the present application are not particularly limited.
The video processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The video processing apparatus provided in the embodiment of the present application can implement each process implemented by the foregoing method embodiment, and is not described here again to avoid repetition.
Corresponding to the foregoing embodiments, optionally, as shown in fig. 6, an electronic device 600 is further provided in this embodiment of the present application, and includes a processor 601, a memory 602, and a program or an instruction stored in the memory 602 and capable of running on the processor 601, where the program or the instruction is executed by the processor 601 to implement each process of the foregoing video processing method embodiment, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 7 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, and a processor 710.
Those skilled in the art will appreciate that the electronic device 700 may also include a power supply (e.g., a battery) for powering the various components, and the power supply may be logically coupled to the processor 710 via a power management system, such that the functions of managing charging, discharging, and power consumption may be performed via the power management system. The electronic device structure shown in fig. 7 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
A user input unit 707 configured to receive a first input for a first interface; the first interface includes a video playback window.
A processor 710, configured to, in response to the first input, update and display the first interface as a first sub-interface and a second sub-interface; the first sub-interface comprises the video playing window, and the second sub-interface comprises a video control window; the video control window is used for storing videos in different video applications to the video control.
The user input unit 707 is further configured to receive a second input for moving the video playing window to the video control window.
The processor 710 is further configured to save the target video being played in the video playing window to the video control in response to the second input.
In one embodiment, the second input is used for moving the video playing window to the first video control sub-window; the first video control sub-window is used for collecting videos in different video applications to the video control; or,
the second input is used for moving the video playing window to the second video control sub-window; the second video control sub-window is used for adding videos in different video applications to the video control.
In one embodiment, the user input unit 707 is further configured to receive a third input for the video control.
A display unit 706, configured to display the first video list and the second video list in response to the third input; wherein the first video list comprises videos from different applications collected to the video control, and the second video list comprises videos from different applications added to the video control.
In one embodiment, the user input unit 707 is further configured to receive a fourth input for the target video if the first video list includes the target video.
The processor 710 is further configured to invoke a target video application to replay the target video in response to the fourth input.
In one embodiment, the user input unit 707 is further configured to receive a fifth input for the target video if the second video list includes the target video.
The processor 710 is further configured to, in response to the fifth input, obtain a current playing time of the target video; and calling a target video application program to continue playing the target video based on the current playing time.
It should be understood that in the embodiment of the present application, the input Unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042, and the Graphics Processing Unit 7041 processes image data of still pictures or videos obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 706 may include a display panel 7061, and the display panel 7061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071 is also referred to as a touch screen. The touch panel 7071 may include two parts of a touch detection device and a touch controller. Other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. Memory 709 may be used to store software programs as well as various data, including but not limited to applications and operating systems. Processor 710 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the video processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above video processing method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A video processing method, comprising:
receiving a first input for a first interface; the first interface comprises a video playing window;
in response to the first input, updating and displaying the first interface as a first sub-interface and a second sub-interface; the first sub-interface comprises the video playing window, and the second sub-interface comprises a video control window; the video control window is used for storing videos in different video application programs to the video control;
receiving a second input for moving the video playing window to the video control window;
and in response to the second input, saving the target video being played in the video playing window to the video control.
2. The method of claim 1, wherein the video control window comprises a first video control sub-window and a second video control sub-window;
the second input is used for moving the video playing window to the first video control sub-window; the first video control sub-window is used for collecting videos in different video applications to the video control; or,
the second input is used for moving the video playing window to the second video control sub-window; the second video control sub-window is used for adding videos in different video applications to the video control.
3. The method of claim 2, further comprising, after saving the target video being played in the video playing window to the video control:
receiving a third input for the video control;
displaying a first video list and a second video list in response to the third input;
wherein the first video list comprises videos from different applications collected to the video control, and the second video list comprises videos from different applications added to the video control.
4. The method of claim 3, further comprising, after said displaying the first list of videos and the second list of videos:
receiving a fourth input for the target video if the first video list includes the target video;
in response to the fourth input, invoking a target video application to replay the target video.
5. The method of claim 3, further comprising, after said displaying the first list of videos and the second list of videos:
receiving a fifth input for the target video if the second video list includes the target video;
responding to the fifth input, and acquiring the current playing time of the target video;
and calling a target video application program to continue playing the target video based on the current playing time.
6. A video processing apparatus comprising:
a first receiving module for receiving a first input for a first interface; the first interface comprises a video playing window;
the updating module is used for, in response to the first input, updating and displaying the first interface as a first sub-interface and a second sub-interface; the first sub-interface comprises the video playing window, and the second sub-interface comprises a video control window; the video control window is used for storing videos in different video application programs to the video control;
a second receiving module, configured to receive a second input for moving the video playing window to the video control window;
and the storage module is used for, in response to the second input, saving the target video being played in the video playing window to the video control.
7. The apparatus of claim 6, wherein the video control window comprises a first video control sub-window and a second video control sub-window;
the second input is used for moving the video playing window to the first video control sub-window; the first video control sub-window is used for collecting videos in different video applications to the video control; or,
the second input is used for moving the video playing window to the second video control sub-window; the second video control sub-window is used for adding videos in different video applications to the video control.
8. The apparatus of claim 7, further comprising:
a third receiving module for receiving a third input for the video control;
a display module to display the first video list and the second video list in response to the third input;
wherein the first video list comprises videos from different applications collected to the video control, and the second video list comprises videos from different applications added to the video control.
9. The apparatus of claim 8, further comprising:
a fourth receiving module, configured to receive a fourth input for the target video if the first video list includes the target video;
and the first control module is used for responding to the fourth input and calling a target video application program to replay the target video.
10. The apparatus of claim 8, further comprising:
a fifth receiving module, configured to receive a fifth input for the target video if the second video list includes the target video;
the acquisition module is used for responding to the fifth input and acquiring the current playing time of the target video;
and the second control module is used for calling a target video application program to continue playing the target video based on the current playing time.
CN202111012297.XA 2021-08-31 2021-08-31 Video processing method and device Pending CN113721816A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111012297.XA CN113721816A (en) 2021-08-31 2021-08-31 Video processing method and device
PCT/CN2022/115453 WO2023030234A1 (en) 2021-08-31 2022-08-29 Video processing method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111012297.XA CN113721816A (en) 2021-08-31 2021-08-31 Video processing method and device

Publications (1)

Publication Number Publication Date
CN113721816A true CN113721816A (en) 2021-11-30

Family

ID=78679732

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111012297.XA Pending CN113721816A (en) 2021-08-31 2021-08-31 Video processing method and device

Country Status (2)

Country Link
CN (1) CN113721816A (en)
WO (1) WO2023030234A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023030234A1 (en) * 2021-08-31 2023-03-09 维沃移动通信有限公司 Video processing method and apparatus

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106878464A (en) * 2017-03-31 2017-06-20 努比亚技术有限公司 A kind of document display method and device
CN110099296A (en) * 2019-03-27 2019-08-06 维沃移动通信有限公司 A kind of information display method and terminal device
CN110851098A (en) * 2019-10-31 2020-02-28 维沃移动通信有限公司 Video window display method and electronic equipment
CN111142724A (en) * 2019-12-24 2020-05-12 维沃移动通信有限公司 Display control method and electronic equipment
CN111857508A (en) * 2020-07-17 2020-10-30 维沃移动通信有限公司 Task management method and device and electronic equipment
CN111880694A (en) * 2020-07-22 2020-11-03 维沃移动通信有限公司 Display method, device, equipment and storage medium
CN111966860A (en) * 2020-07-31 2020-11-20 维沃移动通信(杭州)有限公司 Audio playing method and device and electronic equipment
CN112887802A (en) * 2021-01-27 2021-06-01 维沃移动通信有限公司 Video access method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015013338A2 (en) * 2013-07-26 2015-01-29 Cv Studios Entertainment, Inc. Enhanced mobile video platform
CN113157230A (en) * 2019-09-18 2021-07-23 华为技术有限公司 Data transmission method and related equipment
CN113721816A (en) * 2021-08-31 2021-11-30 维沃移动通信有限公司 Video processing method and device


Also Published As

Publication number Publication date
WO2023030234A1 (en) 2023-03-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination