WO2019196929A1 - Mobile terminal and video data processing method - Google Patents

Mobile terminal and video data processing method

Info

Publication number
WO2019196929A1
Authority
WO
WIPO (PCT)
Prior art keywords
input
text
video
data
video playing
Prior art date
Application number
PCT/CN2019/082452
Other languages
English (en)
Chinese (zh)
Inventor
顾瀚之
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2019196929A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N 21/4126 The peripheral being portable, e.g. PDAs or mobile phones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N 21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N 21/42224 Touch pad or touch panel provided on the remote control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/443 OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N 21/4438 Window management, e.g. event handling following interaction with the user interface
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/47217 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/475 End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/488 Data services, e.g. news ticker
    • H04N 21/4884 Data services, e.g. news ticker for displaying subtitles

Definitions

  • the embodiments of the present disclosure relate to the field of communications technologies, and in particular, to a video data processing method and a mobile terminal.
  • the embodiments of the present disclosure provide a video data processing method and a mobile terminal to solve the problem that the operation is relatively cumbersome when editing the text content related to the video during video playback.
  • an embodiment of the present disclosure further provides a video data processing method, where the method includes:
  • the target data is output according to a data processing manner corresponding to the second input.
  • an embodiment of the present disclosure further provides a mobile terminal, where the mobile terminal includes:
  • a first receiving unit configured to receive a first input of the operating body on the video playing interface during video playing
  • a first display unit configured to pause playback of the video in response to the first input, and display a text editing area and a text display area on the video playing interface;
  • a second receiving unit configured to receive text data input by the user in the text editing area
  • a second display unit configured to display the text data in the text editing area and the text display area respectively;
  • a third receiving unit configured to receive a second input of the user
  • an output unit configured to, in response to the second input, output target data based on the text data and the first image displayed on the video playing interface, according to a data processing manner corresponding to the second input.
  • an embodiment of the present disclosure further provides a mobile terminal, including a processor, a memory, and a video data processing program stored on the memory and operable on the processor, where the video data processing program, when executed by the processor, implements the steps of the video data processing method described above.
  • an embodiment of the present disclosure further provides a computer readable storage medium on which a video data processing program is stored, where the video data processing program, when executed by a processor, implements the steps of the video data processing method described above.
  • In the embodiments of the present disclosure, the mobile terminal user may perform an input operation with the operating body on the video playing interface of the mobile terminal to trigger the mobile terminal to pause the currently playing video, and the user may then input text data on the paused video playing interface.
  • By performing different second inputs, the user can trigger the mobile terminal to output the text data and the first image displayed on the video playing interface in different forms, realizing the editing of different forms of text content on the video playing interface.
  • In this way, different forms of editing of the text content related to the video image may be performed on the video playing interface, and the operation is relatively simple and time-saving.
  • FIG. 1 is a flowchart of a video data processing method according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of an example of a video data processing method according to an embodiment of the present disclosure
  • FIG. 3 is a second example diagram of a video data processing method according to an embodiment of the present disclosure.
  • FIG. 4 is a third example of a video data processing method according to an embodiment of the present disclosure.
  • FIG. 5 is a fourth example of a video data processing method according to an embodiment of the present disclosure.
  • FIG. 6 is a fifth example of a video data processing method according to an embodiment of the present disclosure.
  • FIG. 7 is a sixth example of a video data processing method according to an embodiment of the present disclosure.
  • FIG. 8 is a seventh example of a video data processing method according to an embodiment of the present disclosure.
  • FIG. 9 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram of a hardware structure of a mobile terminal that implements various embodiments of the present disclosure.
  • Embodiments of the present disclosure provide a video data processing method and a mobile terminal.
  • the video data processing method provided by the embodiment of the present disclosure is first introduced.
  • the video data processing method provided by the embodiment of the present disclosure is applicable to a mobile terminal.
  • the mobile terminal may include: a smart phone, a tablet computer, and the like.
  • FIG. 1 is a flowchart of a video data processing method according to an embodiment of the present disclosure. As shown in FIG. 1, the method may include the following steps:
  • Step 101: During video playing, receive a first input of the operating body on the video playing interface.
  • The operating body may include a finger or a stylus.
  • the first input is used to trigger the mobile terminal to pause the playing of the current video and display the text editing area and the text display area on the video playing interface.
  • In an optional embodiment, a preset control is displayed on the video playing interface, and the preset control is used to trigger the operations of pausing the playing video and displaying the text editing area and the text display area on the video playing interface.
  • In this case, step 101 may specifically include the following step: receiving the first input of the operating body on the preset control.
  • the first input may include: a click operation, a long press operation, and the like.
  • the right side of the video playing interface 21 displayed on the screen of the mobile terminal 20 is provided with a preset control 211.
  • the control 211 is set to trigger the mobile terminal 20 to pause the playing of the current video and display the text editing area and the text display area on the video playing interface 21.
  • When the operating body is a finger, the user can also trigger the mobile terminal to pause the playing of the current video and display the text editing area and the text display area on the video playing interface through a gesture on the screen of the mobile terminal.
  • In this case, the first input may include the finger sliding on the screen of the mobile terminal in a specific manner; in practical applications, the specific sliding manner may include three fingers sliding up on the screen at the same time, or two fingers sliding on the screen at the same time, which is not limited in the embodiments of the present disclosure.
  • the user can simultaneously slide up the video playing interface 31 displayed on the screen of the mobile terminal 30 by two fingers 32 to trigger the mobile terminal 30 to pause the playing of the current video.
  • a text editing area and a text display area are displayed on the video playing interface 31.
  • the embodiment of the present disclosure can provide multiple trigger modes for the user to trigger the mobile terminal to pause the playback of the current video and display the text editing area and the text display area on the video playing interface to meet the diverse needs of the user.
  • Step 102 In response to the first input, pause playing of the video, and display a text editing area and a text display area on the video playing interface.
  • In response to the first input, a text editing area and a text display area may be displayed on the video playing interface, where the text editing area is configured to receive and display text data input by the user, and the text display area is used to display the text data received by the text editing area.
  • In an optional embodiment, a preset text editing control may be displayed on the video playing interface of the mobile terminal, where the text editing control may include at least one of the following: a control for deleting text data, a control for undoing an editing operation, a control for a forward (redo) editing operation, and a control for exiting the editing interface.
  • In this case, the method may further include the following steps: receiving a third input of the operating body on the text editing control, and executing the editing operation corresponding to the text editing control in response to the third input.
  • the third input may include: a click operation.
  • Multiple controls for editing text data may be provided on the video playing interface, so that the user can perform various edits (e.g., delete, undo, redo) on the text data; the user only needs a simple click on a control to trigger the mobile terminal to complete the corresponding text editing operation, which increases the convenience and flexibility of the user operation.
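The delete, undo, and forward (redo) controls described above follow a standard edit-history pattern backed by two stacks. The sketch below is an illustrative Python model only; the class and method names are hypothetical and not taken from the disclosure.

```python
class TextEditHistory:
    """Minimal undo/redo model for the text editing controls.

    Illustrative sketch: the disclosure only names the controls
    (delete, undo, forward/redo, exit); two stacks are one common
    way to implement them.
    """

    def __init__(self):
        self.text = ""
        self._undo = []  # earlier states of the text
        self._redo = []  # states that were undone and can be re-applied

    def _commit(self, new_text):
        self._undo.append(self.text)
        self._redo.clear()  # a fresh edit invalidates the redo chain
        self.text = new_text

    def write(self, chars):
        """User writes characters in the text editing area."""
        self._commit(self.text + chars)

    def delete_last(self):
        """The 'delete text data' control: remove the last character."""
        if self.text:
            self._commit(self.text[:-1])

    def undo(self):
        """The 'undo editing operation' control."""
        if self._undo:
            self._redo.append(self.text)
            self.text = self._undo.pop()

    def forward(self):
        """The 'forward editing operation' control (redo)."""
        if self._redo:
            self._undo.append(self.text)
            self.text = self._redo.pop()
```

For example, writing "good", deleting the last character, undoing, and pressing forward again yields "good" then "goo" as the current text.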
  • the play information of the video may also be displayed on the video play interface; wherein the play information may include at least one of the following: a video name, a total duration of the video, and a play progress of the video.
  • the play information of the video can be displayed on the video play interface, so that the user can understand the currently paused video.
  • the mobile terminal 40 pauses the video being played, and displays a text editing area 411 and a text display area 412 on the video playing interface 41.
  • The video name "Star Wars", the total duration of the video "02:00:00", and the playback progress of the video "00:02:00" are displayed in the upper left corner of the video playing interface 41. The upper right corner of the video playing interface 41 carries a plurality of text editing controls: a control 413 for deleting text data, a control 414 for undoing an editing operation, and a control 415 for a forward (redo) editing operation; a control 416 for exiting the editing interface is displayed in the lower right corner of the video playing interface 41.
  • Step 103 Receive text data input by the user in the text editing area.
  • the user can input text data in the text editing area using a finger, or input text data in the text editing area using a stylus.
  • Step 104 Display text data in the text editing area and the text display area, respectively.
  • The user writes the text data "good" in the text editing area 511 displayed on the video playing interface 51 of the mobile terminal 50; after the user stops writing for a time t, the mobile terminal 50 automatically converts the written text data from image text to electronic text and displays it in the text display area 512.
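The behavior described above, where handwriting is converted only after the user has stopped writing for a time t, amounts to an idle-timeout trigger. The sketch below is an illustrative Python model under that assumption; the recognizer itself is out of scope and is represented by a callback, and all names are hypothetical.

```python
class HandwritingBuffer:
    """Collects strokes and decides when to hand them to a recognizer.

    Illustrative sketch: the disclosure only states that conversion
    happens after the user stops writing for a time t; here that wait
    is modeled as an idle threshold checked by periodic polling.
    """

    def __init__(self, idle_threshold_s, recognize):
        self.idle_threshold_s = idle_threshold_s
        self.recognize = recognize  # callback: list of strokes -> text
        self.strokes = []
        self.last_stroke_at = None

    def add_stroke(self, stroke, timestamp_s):
        """Record one stroke and remember when it ended."""
        self.strokes.append(stroke)
        self.last_stroke_at = timestamp_s

    def poll(self, now_s):
        """Call periodically; returns recognized text once the writing
        has been idle for at least idle_threshold_s, else None."""
        if self.strokes and now_s - self.last_stroke_at >= self.idle_threshold_s:
            text = self.recognize(self.strokes)
            self.strokes = []  # conversion consumes the buffered strokes
            return text
        return None
```

With a one-second threshold, a stroke added at t=0 yields no conversion when polled at t=0.5 and a recognized string when polled at t=1.2.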
  • the controls 513-516 displayed in the video playing interface 51 are similar to the controls in FIG. 4, and are not described herein again.
  • If the input is wrong, the user can delete it and rewrite by clicking the control 513 for deleting text data; the other controls can likewise be operated by tapping with a finger.
  • Step 105 Receive a second input of the user.
  • the user may trigger the mobile terminal to output the text data and the first image displayed by the video playing interface in different data processing manners by inputting different second inputs.
  • the second input may be the fingerprint information of the user, where each data processing manner maps at least one type of fingerprint information; or the second input is a touch gesture, wherein each data processing manner maps at least one touch gesture .
  • multiple input modes can be supported to trigger the mobile terminal to output the text data and the first image displayed by the video playing interface in different data processing manners to meet the diversified needs of the user.
  • the fingerprints of different fingers may respectively map a data processing manner.
  • the data processing manner may include: outputting a note, outputting a subtitle, and outputting a bullet screen (barrage).
  • A fingerprint input area 617 is displayed on the video playing interface 61 of the mobile terminal 60, and the user can perform fingerprint input by pressing the area 617 with a finger whose fingerprint has been pre-recorded. After receiving the fingerprint, the mobile terminal 60 outputs the target data using the data processing manner corresponding to that fingerprint.
  • the fingerprint of the right index finger is used to trigger the output of the note
  • the fingerprint of the right middle finger is used to trigger the output of the subtitle.
  • the fingerprint of the right ring finger is used to trigger the output of the barrage.
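The per-finger mapping described above reduces to a lookup table from an enrolled fingerprint to a data processing manner. A minimal illustrative sketch follows; the fingerprint identifiers and mode names are assumptions, not taken from the disclosure.

```python
# Illustrative mapping from a recognized fingerprint ID to a data
# processing manner, mirroring the example: right index finger -> note,
# right middle finger -> subtitle, right ring finger -> barrage.
FINGERPRINT_MODES = {
    "right_index": "note",
    "right_middle": "subtitle",
    "right_ring": "barrage",
}


def mode_for_fingerprint(fingerprint_id):
    """Return the output mode mapped to this fingerprint, or None if
    the fingerprint is not enrolled for any data processing manner."""
    return FINGERPRINT_MODES.get(fingerprint_id)
```

An unenrolled finger simply maps to no mode, which lets the terminal ignore the press or prompt the user to enroll.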
  • the controls 611-616 displayed on the video playing interface 61 are similar to the controls 411-416 in FIG. 4, and details are not described herein again.
  • Step 106: In response to the second input, output the target data based on the text data and the first image displayed on the video playing interface, according to the data processing manner corresponding to the second input.
  • In this step, the text data and the first image displayed on the video playing interface may be output, according to the data processing manner corresponding to the second input, as a note, a subtitle, or a barrage.
  • the foregoing step 106 may specifically include the following steps:
  • the note data is generated and output based on the text data and the first image displayed on the video playing interface, where the note data includes the text data and the first image.
  • the user can realize the recording of the note without exiting the video, the operation is relatively simple, and the operation time of recording the note when watching the video is shortened, thereby improving the user experience.
  • When a subtitle is to be output, after step 105, a subtitle setting area is displayed on the video playing interface; in this case, the foregoing step 106 may specifically include the following steps:
  • receiving a first display parameter of the text data input by the user in the subtitle setting area, and outputting the text data in the form of a subtitle according to the first display parameter.
  • the user can implement the editing of the subtitles without exiting the video, and the operation is relatively simple, and the user experience is improved.
  • the first display parameter includes at least one of the following: a time when the text data appears, a displayed position, a font, a font size, and a font color.
  • The video playing interface 71 of the mobile terminal 70 displays a subtitle setting area 711. Above the subtitle setting area 711, a control for saving the subtitle, labeled "Save as subtitle", is displayed, and the middle portion displays a time setting area for setting the period during which the subtitle appears. The user can input time information in the time setting area: offsetting forward by some number of seconds from the current moment gives the moment at which the subtitle is displayed, and offsetting backward by some number of seconds gives the moment at which the subtitle disappears.
  • After completing the time setting, the user can save the subtitle information by clicking the control "Save as subtitle".
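One reading of the time setting area described above is that the forward offset fixes when the subtitle appears relative to the paused playback time, and the backward offset fixes when it disappears. The sketch below encodes that interpretation in Python; both the reading and all names are assumptions, not taken from the disclosure.

```python
def subtitle_window(pause_time_s, forward_s, backward_s):
    """Compute the (appear, disappear) times for a subtitle.

    Assumed interpretation of the time setting area: moving forward
    forward_s seconds from the paused moment gives the time the
    subtitle appears, and moving backward backward_s seconds gives
    the time it disappears. Times are in seconds of video playback.
    """
    appear_at = pause_time_s - forward_s
    disappear_at = pause_time_s + backward_s
    if appear_at < 0:
        appear_at = 0.0  # clamp to the start of the video
    return appear_at, disappear_at
```

For a video paused at 02:00 (120 s), offsets of 2 s forward and 3 s backward would show the subtitle from 118 s to 123 s under this reading.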
  • the controls 712-714 on the video playing interface 71 are similar to the controls in FIG. 4, and are not described herein again.
  • When a barrage is to be output, after step 105, the following step may be added: displaying a barrage setting area on the video playing interface; at this time, step 106 may specifically include the following steps:
  • receiving a second display parameter of the text data input by the user in the barrage setting area, and outputting the text data in the form of a barrage according to the second display parameter.
  • the user can implement the editing of the barrage without exiting the video, and the operation is relatively simple, and the user experience is improved.
  • the second display parameter includes at least one of the following: the time when the text data appears, the displayed position, the speed of the movement, the font, the font size, and the font color.
  • The video playing interface 81 of the mobile terminal 80 displays a barrage setting area 811, and a control for confirming the saving of the barrage, labeled "Save as barrage", is displayed above the barrage setting area 811. The middle portion displays a duration setting area for setting the duration of the barrage, in which the user can input duration information, that is, for how many seconds from this moment the barrage is displayed over the video.
  • The mobile terminal 80 calculates the scrolling speed according to the width W of the video playing interface 81 and the duration input by the user, and during playback the barrage is loaded at this speed at this moment. After completing the setting of the duration of the barrage, the user can save the barrage information by clicking the control "Save as barrage".
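The speed calculation described above follows naturally as speed = W / duration: the barrage crosses the interface width W in the user-entered time. An illustrative Python sketch, with parameter names that are assumptions:

```python
def barrage_speed(interface_width_px, duration_s):
    """Horizontal scrolling speed for a bullet comment (barrage).

    Assumed formula from the description: the barrage traverses the
    playing interface width W within the user-entered duration, so
    speed = W / duration, in pixels per second.
    """
    if duration_s <= 0:
        raise ValueError("barrage duration must be positive")
    return interface_width_px / duration_s
```

For a 1920 px wide playing interface and a 4 s duration, the barrage would scroll at 480 px/s under this formula.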
  • the controls 812-814 on the video playing interface 81 are similar to the controls in FIG. 4, and are not described herein again.
  • In the embodiments of the present disclosure, the mobile terminal user can perform an input operation with the operating body on the video playing interface of the mobile terminal to trigger the mobile terminal to pause the currently playing video, and the user can then input text data on the paused video playing interface.
  • By performing different second inputs, the user can trigger the mobile terminal to output the text data and the first image displayed on the video playing interface in different forms, which enables the editing of different forms of text content on the video playing interface.
  • different forms of editing of the text content related to the video image may be performed on the video playing interface, and the operation is relatively simple and time-saving.
  • FIG. 9 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure.
  • the mobile terminal 900 may include: a first receiving unit 901, a first display unit 902, a second receiving unit 903, a second display unit 904, a third receiving unit 905, and an output unit 906, wherein:
  • the first receiving unit 901 is configured to receive a first input of the operating body on the video playing interface during the video playing process
  • the first display unit 902 is configured to pause playback of the video in response to the first input, and display a text editing area and a text display area on the video playing interface;
  • a second receiving unit 903 configured to receive text data input by the user in the text editing area
  • a second display unit 904 configured to respectively display the text data in the text editing area and the text display area;
  • a third receiving unit 905, configured to receive a second input of the user
  • the output unit 906 is configured to, in response to the second input, output target data based on the text data and the first image displayed on the video playing interface, according to a data processing manner corresponding to the second input.
  • In the embodiments of the present disclosure, the mobile terminal user can perform an input operation with the operating body on the video playing interface of the mobile terminal to trigger the mobile terminal to pause the currently playing video, and the user can then input text data on the paused video playing interface.
  • By performing different second inputs, the user can trigger the mobile terminal to output the text data and the first image displayed on the video playing interface in different forms, which enables the editing of different forms of text content on the video playing interface.
  • different forms of editing of the text content related to the video image may be performed on the video playing interface, and the operation is relatively simple and time-saving.
  • In an optional embodiment, a preset control is displayed on the video playing interface, and the preset control is configured to trigger the operations of pausing the playing of the video and displaying a text editing area and a text display area on the video playing interface;
  • the first receiving unit 901 may include:
  • the first input receiving subunit is configured to receive a first input of the operating body on the preset control.
  • the output unit 906 may include:
  • a note output subunit configured to generate and output note data based on the text data and the first image displayed on the video playing interface, where the note data includes the text data and the first image.
  • the mobile terminal 900 may further include:
  • a third display unit configured to display a caption setting area on the video playing interface
  • the output unit 906 can include:
  • a first display parameter receiving subunit configured to receive a first display parameter of the text data input by the user in the subtitle setting area
  • a subtitle output subunit configured to output the text data in a subtitle form according to the first display parameter.
  • the mobile terminal 900 may further include:
  • a fourth display unit configured to display a barrage setting area on the video playing interface
  • the output unit 906 can include:
  • a second display parameter receiving subunit configured to receive a second display parameter of the text data input by the user in the barrage setting area
  • the barrage output subunit is configured to output the text data in the form of a barrage according to the second display parameter.
  • the second input is fingerprint information of the user, where each data processing manner maps at least one type of fingerprint information;
  • the second input is a touch gesture, wherein each data processing manner maps at least one touch gesture.
  • FIG. 10 is a schematic diagram of a hardware structure of a mobile terminal that implements various embodiments of the present disclosure.
  • the mobile terminal 1000 includes, but is not limited to, a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and a power source 1011.
  • the mobile terminal structure shown in FIG. 10 does not constitute a limitation of the mobile terminal; the mobile terminal may include more or fewer components than those illustrated, combine some components, or use a different arrangement of components.
  • the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle terminal, a wearable device, a pedometer, and the like.
  • the processor 1010 is configured to: receive a first input of the operating body on the video playing interface during video playback; in response to the first input, pause the playing of the video and display a text editing area and a text display area on the video playing interface; receive text data input by the user in the text editing area; display the text data in the text editing area and the text display area respectively; receive a second input of the user; and in response to the second input, output target data according to a data processing manner corresponding to the second input, based on the text data and a first image displayed by the video playing interface.
  • the user of the mobile terminal may perform an input operation on the video playing interface with the operating body to trigger the mobile terminal to pause the currently playing video; after the video is paused, the user may then input text data on the video playing interface.
  • by providing different second inputs, the user can trigger the mobile terminal to output the text data and the first image displayed on the video playing interface in different forms, thereby editing text content on the video playing interface in different forms.
  • In this way, different forms of editing of text content related to the video image can be performed on the video playing interface, and the operation is relatively simple and time-saving.
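The overall flow described above — the first input pauses playback and opens the two areas, entered text appears in both areas, and the second input selects the output form — can be sketched as a toy state model (all names hypothetical):

```python
class VideoPlayer:
    """Toy model of the described flow: the first input pauses playback
    and opens the editing areas; the second input selects the output."""

    def __init__(self):
        self.playing = True
        self.edit_area = None     # text editing area content
        self.display_area = None  # text display area content

    def first_input(self):
        self.playing = False      # pause the video
        self.edit_area = ""       # show the text editing area
        self.display_area = ""    # show the text display area

    def enter_text(self, text: str):
        # the text data is displayed in both areas respectively
        self.edit_area = text
        self.display_area = text

    def second_input(self, manner: str) -> dict:
        # output target data in the manner mapped to the second input,
        # based on the text data and the first image (the paused frame)
        return {"manner": manner, "text": self.display_area,
                "first_image": "<paused frame>"}
```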
  • a preset control is displayed on the video playing interface, where the preset control is used to perform the operations of pausing playback of the video and displaying a text editing area and a text display area on the video playing interface;
  • the receiving of a first input of the operating body on the video playing interface includes:
  • the outputting of target data according to a data processing manner corresponding to the second input, based on the text data and the first image displayed by the video playing interface, includes:
  • note data includes the text data and the first image.
  • the method further includes:
  • the text data is output in the form of subtitles according to the first display parameter.
  • the method further includes:
  • the text data is output in the form of a barrage according to the second display parameter.
  • the method further includes at least one of the following:
  • the second input is fingerprint information of the user, where each data processing manner is mapped to at least one type of fingerprint information; or
  • the second input is a touch gesture, where each data processing manner is mapped to at least one touch gesture.
  • the radio frequency unit 1001 may be used for receiving and transmitting signals in the process of sending and receiving information or during a call; specifically, downlink data received from a base station is forwarded to the processor 1010 for processing, and uplink data is sent to the base station.
  • radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio unit 1001 can also communicate with the network and other devices through a wireless communication system.
  • the mobile terminal provides wireless broadband Internet access to the user through the network module 1002, such as helping the user to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 1003 can convert the audio data received by the radio frequency unit 1001 or the network module 1002, or stored in the memory 1009, into an audio signal and output it as sound. Moreover, the audio output unit 1003 may also provide audio output related to a specific function performed by the mobile terminal 1000 (e.g., a call signal reception sound, a message reception sound, etc.).
  • the audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 1004 is for receiving an audio or video signal.
  • the input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042; the graphics processor 10041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image can be displayed on the display unit 1006.
  • the image processed by the graphics processor 10041 may be stored in the memory 1009 (or other storage medium) or transmitted via the radio frequency unit 1001 or the network module 1002.
  • the microphone 10042 can receive sound and can process such sound as audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 1001 for output.
  • the mobile terminal 1000 also includes at least one type of sensor 1005, such as a light sensor, motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor, where the ambient light sensor can adjust the brightness of the display panel 10061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 10061 and/or the backlight when the mobile terminal 1000 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the attitude of the mobile terminal (such as switching between horizontal and vertical screens, and in related games).
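The horizontal/vertical screen switching mentioned above amounts to comparing the gravity components reported by the accelerometer; a minimal sketch, assuming a two-axis reading (function name and axis convention are illustrative):

```python
def orientation_from_gravity(gx: float, gy: float) -> str:
    """Decide landscape vs portrait from gravity components along the
    device's x (short) and y (long) axes, as an accelerometer reports
    them when the terminal is stationary."""
    # When gravity lies mostly along the long axis, the phone is upright.
    return "portrait" if abs(gy) >= abs(gx) else "landscape"
```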
  • the sensor 1005 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described herein.
  • the display unit 1006 is for displaying information input by the user or information provided to the user.
  • the display unit 1006 can include a display panel 10061.
  • the display panel 10061 can be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 1007 can be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal.
  • the user input unit 1007 includes a touch panel 10071 and other input devices 10072.
  • the touch panel 10071, also referred to as a touch screen, can collect touch operations by the user on or near it (such as operations performed by the user on or near the touch panel 10071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 10071 may include two parts of a touch detection device and a touch controller.
  • the touch detection device detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 1010, and receives and executes commands sent by the processor 1010.
  • the touch panel 10071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • the user input unit 1007 may also include other input devices 10072.
  • the other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control button, a switch button, etc.), a trackball, a mouse, and a joystick, which are not described herein.
  • the touch panel 10071 can be overlaid on the display panel 10061; after the touch panel 10071 detects a touch operation on or near it, the touch panel 10071 transmits it to the processor 1010 to determine the type of the touch event, and the processor 1010 then provides a corresponding visual output on the display panel 10061 according to the type of the touch event.
  • Although in FIG. 10 the touch panel 10071 and the display panel 10061 are used as two independent components to implement the input and output functions of the mobile terminal, in some embodiments the touch panel 10071 and the display panel 10061 may be integrated to implement the input and output functions of the mobile terminal; this is not limited herein.
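The detection-device-to-controller-to-processor pipeline just described can be sketched as follows (the event names and visual outputs are illustrative assumptions, not part of the disclosure):

```python
def touch_controller(raw_signal: dict) -> tuple:
    # The touch controller converts the detection device's raw signal
    # into contact coordinates for the processor.
    return (raw_signal["x"], raw_signal["y"])

def processor_dispatch(event_type: str, coords: tuple) -> str:
    # The processor determines the type of the touch event and chooses
    # a corresponding visual output for the display panel.
    visual = {"tap": "highlight", "long_press": "context_menu",
              "swipe": "scroll"}
    return f"{visual.get(event_type, 'none')}@{coords}"
```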
  • the interface unit 1008 is an interface for connecting an external device to the mobile terminal 1000.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • the interface unit 1008 can be configured to receive input from an external device (e.g., data information, power, etc.) and transmit the received input to one or more components within the mobile terminal 1000, or can be used to transfer data between the mobile terminal 1000 and an external device.
  • the memory 1009 can be used to store software programs as well as various data.
  • the memory 1009 may mainly include a storage program area and a storage data area, where the storage program area may store an operating system and applications required by at least one function (such as a sound playing function, an image playing function, etc.), and the storage data area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.).
  • the memory 1009 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 1010 is the control center of the mobile terminal; it connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 1009 and calling data stored in the memory 1009, thereby monitoring the mobile terminal as a whole.
  • the processor 1010 may include one or more processing units; optionally, the processor 1010 may integrate an application processor and a modem processor, where the application processor mainly processes the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 1010.
  • the mobile terminal 1000 may further include a power source 1011 (such as a battery) for supplying power to various components.
  • the power source 1011 may be logically connected to the processor 1010 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • In addition, the mobile terminal 1000 includes some functional modules that are not shown, and details are not described herein.
  • an embodiment of the present disclosure further provides a mobile terminal, including a processor 1010, a memory 1009, and a video data processing program stored in the memory 1009 and operable on the processor 1010; when the video data processing program is executed by the processor 1010, the processes of the foregoing video data processing method embodiment are implemented, and the same technical effects can be achieved. To avoid repetition, details are not described herein again.
  • the embodiment of the present disclosure further provides a computer readable storage medium.
  • the computer readable storage medium stores a video data processing program which, when executed by a processor, implements the processes of the foregoing video data processing method embodiment and can achieve the same technical effects; to avoid repetition, details are not described herein again.
  • the computer readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • Through the description of the foregoing embodiments, it can be understood that the methods of the foregoing embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, and certainly may also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present disclosure, or the part contributing to the related art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or a CD-ROM), which includes a number of instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the various embodiments of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Telephone Function (AREA)

Abstract

The present disclosure provides a mobile terminal and a video data processing method. The method includes: during video playback, receiving a first input of an operating body on a video playing interface; in response to the first input, pausing the playing of the video, and displaying a text editing area and a text display area on the video playing interface; receiving text data input by a user in the text editing area; displaying the text data in the text editing area and the text display area respectively; receiving a second input of the user; and in response to the second input, outputting target data in a data processing manner corresponding to the second input, based on the text data and a first image displayed on the video playing interface.
PCT/CN2019/082452 2018-04-13 2019-04-12 Mobile terminal and video data processing method WO2019196929A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810332493.7 2018-04-13
CN201810332493.7A CN108737904B (zh) 2018-04-13 2018-04-13 一种视频数据处理方法及移动终端

Publications (1)

Publication Number Publication Date
WO2019196929A1 true WO2019196929A1 (fr) 2019-10-17

Family

ID=63938937

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/082452 WO2019196929A1 (fr) 2018-04-13 2019-04-12 Mobile terminal and video data processing method

Country Status (2)

Country Link
CN (1) CN108737904B (fr)
WO (1) WO2019196929A1 (fr)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108737904B (zh) * 2018-04-13 2021-06-22 维沃移动通信有限公司 一种视频数据处理方法及移动终端
CN109379631B (zh) * 2018-12-13 2020-11-24 广州艾美网络科技有限公司 一种通过移动终端编辑视频字幕的方法
CN110062281B (zh) * 2019-05-29 2021-08-24 维沃移动通信有限公司 一种播放进度调节方法及其终端设备
CN112306338A (zh) * 2019-09-04 2021-02-02 北京字节跳动网络技术有限公司 一种视频退出方法、装置和电子设备
CN111400552B (zh) * 2020-03-31 2024-02-27 维沃移动通信有限公司 便签创建方法及电子设备
CN114390358A (zh) * 2020-10-21 2022-04-22 上海哔哩哔哩科技有限公司 弹幕输入方法以及装置
CN112689165B (zh) * 2020-12-18 2023-03-31 中国联合网络通信集团有限公司 视频播放方法及装置

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130263002A1 (en) * 2012-03-30 2013-10-03 Lg Electronics Inc. Mobile terminal
CN104661096A (zh) * 2013-11-21 2015-05-27 深圳市快播科技有限公司 视频弹幕添加方法及装置和视频播放方法及视频播放器
CN104714937A (zh) * 2015-03-30 2015-06-17 北京奇艺世纪科技有限公司 一种评论信息发布方法及装置
CN105872820A (zh) * 2015-12-03 2016-08-17 乐视云计算有限公司 添加视频标签的方法和装置
CN106155514A (zh) * 2015-04-23 2016-11-23 中兴通讯股份有限公司 一种实现触控的方法和装置
CN106507177A (zh) * 2016-11-30 2017-03-15 百度在线网络技术(北京)有限公司 用于生成弹幕的方法和装置
CN107734373A (zh) * 2017-10-12 2018-02-23 网易(杭州)网络有限公司 弹幕发送方法及装置、存储介质、电子设备
CN108737904A (zh) * 2018-04-13 2018-11-02 维沃移动通信有限公司 一种视频数据处理方法及移动终端

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8566353B2 (en) * 2008-06-03 2013-10-22 Google Inc. Web-based system for collaborative generation of interactive videos
WO2015163555A1 (fr) * 2014-04-22 2015-10-29 주식회사 뱁션 Système et procédé d'insertion de sous-titres
CN104935980B (zh) * 2015-05-04 2019-03-15 腾讯科技(北京)有限公司 互动信息处理方法、客户端及服务平台
CN104980809B (zh) * 2015-06-30 2019-03-12 北京奇艺世纪科技有限公司 一种弹幕处理方法和装置
CN106648424B (zh) * 2016-11-23 2020-11-24 广州华多网络科技有限公司 截图方法及装置
CN107707953B (zh) * 2017-10-20 2020-06-05 维沃移动通信有限公司 一种资源数据展示方法及移动终端


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113630644A (zh) * 2021-06-29 2021-11-09 北京搜狗科技发展有限公司 视频内容编辑器的编辑方法、装置及存储介质
CN113630644B (zh) * 2021-06-29 2024-01-30 北京搜狗科技发展有限公司 视频内容编辑器的编辑方法、装置及存储介质
CN114025215A (zh) * 2021-11-04 2022-02-08 深圳传音控股股份有限公司 文件处理方法、移动终端及存储介质

Also Published As

Publication number Publication date
CN108737904B (zh) 2021-06-22
CN108737904A (zh) 2018-11-02


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19785820; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19785820; Country of ref document: EP; Kind code of ref document: A1)