CN109451178B - Video playing method and terminal - Google Patents

Video playing method and terminal

Info

Publication number: CN109451178B
Application number: CN201811611824.7A
Authority: CN (China)
Prior art keywords: video, playing, screen unit, screen, camera
Legal status: Active (assumed by Google; not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN109451178A (en)
Inventor: 徐青昆
Current assignee: Vivo Mobile Communication Co Ltd (listed assignees may be inaccurate)
Original assignee: Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority: CN201811611824.7A
Publication of application: CN109451178A
Application granted; publication of grant: CN109451178B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H04M1/72466 User interfaces specially adapted for cordless or mobile telephones with selection means, e.g. keys, having functions defined by the mode or the status of the device

Abstract

The invention provides a video playing method and a terminal. The method comprises: receiving a first input, wherein the first input is used to start a multi-screen linked shooting function; determining a video source to be played by each of at least two screen units used for multi-screen linked shooting, wherein the video source comprises at least one of a reference video file and a video shot in real time by a camera of the terminal; receiving a second input, wherein the second input is a playing command for playing a reference video file or a shooting command for shooting a video; and, in response to the second input, simultaneously playing videos of the corresponding video sources on the at least two screen units. By waking up the at least two screen units at the same time and simultaneously playing the corresponding video sources on them, the embodiment of the invention helps the user follow actions while shooting and compare effects after shooting.

Description

Video playing method and terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a video playing method and a terminal.
Background
With the advance of the intelligent terminal industry, terminal camera technology has developed rapidly, and hardware such as front cameras and dual cameras has appeared in succession. While meeting users' ever stronger self-shooting (selfie) demands, this has further raised their requirements for selfie quality, diversity, and interesting shooting modes.
When a user imitates a specific classic scene or an exaggerated motion for a selfie on a conventional smartphone, the scenes and motions in the video clip must be memorized many times; even so, the motions are very likely not restored accurately during the selfie, or a satisfactory, entertaining selfie video is obtained only after many practice takes. Shooting with an intelligent terminal in this scenario is therefore clearly inadequate.
Meanwhile, when watching a self-shot video, the user cannot easily compare it side by side with the original video, which makes it hard to find and correct action errors in time.
Disclosure of Invention
The invention provides a video playing method and a terminal, aiming to overcome the shortcomings of intelligent terminals in imitation-shooting scenarios in the prior art.
In order to solve the technical problem, the invention is realized as follows: a video playing method is applied to a terminal, the terminal at least comprises two screen units, and the method comprises the following steps:
receiving a first input, wherein the first input is used for starting a multi-screen linkage shooting function;
determining a video source to be played by each screen unit of at least two screen units adopted by multi-screen linkage shooting, wherein the video source comprises at least one of a reference video file and a video shot by a camera of the terminal in real time;
receiving a second input, wherein the second input is a playing command for playing a reference video file or a shooting command for shooting a video;
in response to the second input, video of the corresponding video source is played on the at least two screen units simultaneously.
The embodiment of the invention also provides a terminal, which at least comprises two screen units, and the terminal comprises:
a starting module, configured to receive a first input, wherein the first input is used to start a multi-screen linked shooting function;
a determining module, configured to determine a video source to be played by each of at least two screen units used for multi-screen linked shooting, wherein the video source comprises at least one of a reference video file and a video shot in real time by a camera of the terminal;
the receiving module is used for receiving a second input, wherein the second input is a playing command for playing a reference video file or a shooting command for shooting a video;
and the playing module is used for responding to the second input and simultaneously playing the videos of the corresponding video sources on the at least two screen units.
An embodiment of the present invention provides a terminal, including a processor, a memory, and a computer program stored on the memory and capable of running on the processor, where the computer program, when executed by the processor, implements the steps of the video playing method described above.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the video playing method described above.
In the embodiment of the invention, by waking up at least two screen units simultaneously and simultaneously playing the corresponding video sources on the woken-up screen units, the user is assisted in following actions while shooting and in comparing effects after shooting.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a flowchart illustrating steps of a video playing method according to an embodiment of the present invention;
fig. 2 is a schematic view illustrating a terminal playing in a video playing method according to an embodiment of the present invention;
fig. 3 is a second schematic view illustrating a playing of a terminal in the video playing method according to the embodiment of the present invention;
fig. 4 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 5 is a second schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 6 is a third schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings in the embodiments. The described embodiments are some, not all, of the embodiments of the present invention. All other embodiments that a person skilled in the art can derive from them without creative effort fall within the protection scope of the present invention.
As shown in fig. 1, an embodiment of the present invention provides a video playing method, which is applied to a terminal, where the terminal includes at least two screen units (optionally, at least two screen units are foldable screen units), and the method includes:
step 101, receiving a first input, where the first input is used to start a multi-screen linkage shooting function. The first input may wake up at least two screen units.
In this step, a user may trigger a first input (e.g., select a multi-screen linked shooting function) in the camera application, and then the terminal receives the first input, and in response to the first input, the terminal starts the multi-screen linked shooting function and wakes up at least two screen units, where the woken-up screen units are used for performing multi-screen linked shooting. Which of the screen units to wake up may be selected by the user, or may be preset, and is not limited herein.
Step 102, determining a video source to be played by each screen unit of at least two screen units adopted by multi-screen linkage shooting, wherein the video source comprises at least one of a reference video file and a video shot by a camera of the terminal in real time.
In this step, the reference video file may be a local video file or a networked video file. Optionally, the video source to be played by each screen unit may be operated by the user on the corresponding screen unit. For example, if at least two awakened screen units are an a screen unit and a B screen unit, respectively, an operation is performed on the a screen unit to select a video source to be played corresponding to the a screen unit, and an operation is performed on the B screen unit to select a video source to be played corresponding to the B screen unit.
Step 103, receiving a second input, where the second input is a play command for playing a reference video file or a shooting command for shooting a video.
In this step, the playing command for playing the reference video file may be a playing command acting on a screen unit for playing the reference video file; for example, a play command triggered by triggering a play button; also, the photographing command to photograph the video may be a photographing command on a screen unit acting on the video played by the camera in real time, for example, a photographing command triggered by triggering a photographing key.
And 104, responding to the second input, and simultaneously playing videos of the corresponding video sources on the at least two screen units.
In this step, in order to solve the synchronization problem, the video corresponding to the video source is played on the at least two screen units simultaneously in response to any one of the play command or the shooting command, and the user does not need to click on the two screen units simultaneously.
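The four steps above can be sketched as a small state machine (an illustrative Python sketch; every class and method name here is hypothetical, not from the patent):

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class SourceType(Enum):
    REFERENCE_FILE = auto()   # a local or networked reference video file
    LIVE_CAMERA = auto()      # video shot in real time by the terminal's camera

@dataclass
class ScreenUnit:
    name: str
    source: Optional[SourceType] = None
    playing: bool = False

class Terminal:
    def __init__(self, screen_names):
        self.screens = [ScreenUnit(n) for n in screen_names]
        self.linked_mode = False

    def on_first_input(self):
        # step 101: start the multi-screen linked shooting function and
        # wake up at least two screen units
        self.linked_mode = True

    def set_source(self, screen_name, source):
        # step 102: determine the video source for each screen unit
        next(s for s in self.screens if s.name == screen_name).source = source

    def on_second_input(self):
        # steps 103-104: a single play or shoot command starts playback on
        # all screen units at the same time -- no need to tap both screens
        for s in self.screens:
            s.playing = True

t = Terminal(["A", "B"])
t.on_first_input()
t.set_source("A", SourceType.REFERENCE_FILE)
t.set_source("B", SourceType.LIVE_CAMERA)
t.on_second_input()
print(all(s.playing for s in t.screens))  # → True
```

The single `on_second_input()` call starting playback on every woken-up screen unit is the synchronization point that step 104 emphasizes.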
As one embodiment, step 102 includes:
determining that a first screen unit of the at least two screen units plays a reference video file, and determining that a second screen unit of the at least two screen units plays a video shot by a camera of the terminal in real time;
accordingly, step 104 includes:
and shooting a video through a camera of the terminal and playing the video shot by the camera in real time on the second screen unit while the first screen unit plays the reference video file.
The embodiment of the invention uses a simple, intuitive folding-screen mode to display the reference video file (also referred to as the original or imitated video file) and the video shot by the camera in real time at the same time: the first screen unit displays the reference video file while the second screen unit displays the video shot by the camera, achieving synchronous shooting and synchronous playing.
To keep playback of the reference video and shooting of the video synchronized, the user clicks the play key on the first screen unit just before shooting starts, and the terminal's camera starts shooting in linkage, with the captured video played on the second screen unit; alternatively, the user clicks the shoot key on the second screen unit just before shooting starts, and the reference video file on the first screen unit starts playing synchronously in linkage.
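This one-key linkage can be illustrated with a minimal sketch (the class and method names are hypothetical, not the patented implementation): a command received on either screen unit starts both the reference playback and the camera in the same call.

```python
class LinkedSession:
    def __init__(self):
        self.reference_playing = False   # reference file on the first screen
        self.camera_recording = False    # live camera feed on the second screen

    def _start_both(self):
        # the linkage: one command drives both screen units together
        self.reference_playing = True
        self.camera_recording = True

    def tap_play_on_first_screen(self):
        # a play command on the reference screen also starts the camera
        self._start_both()

    def tap_shoot_on_second_screen(self):
        # a shoot command on the camera screen also starts the reference video
        self._start_both()

s = LinkedSession()
s.tap_play_on_first_screen()
print(s.reference_playing and s.camera_recording)  # → True
```

Either entry point reaches the same `_start_both()`, which is what removes the need to tap two screens simultaneously.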
It should be noted that the video shot by the camera of the terminal in real time includes any one of the following:
a video shot in real time by a first camera arranged on a first screen unit;
the video shot in real time by a second camera arranged on a second screen unit;
the first camera arranged on the first screen unit and the second camera arranged on the second screen unit jointly shoot videos in real time.
In the above embodiment of the present invention, cameras on different screen units may be called for the displayed shot picture. For example, a user accustomed to watching the reference video content may select the camera on the first screen unit, so that the shot stays focused in front of the lens and highlights the user's camera presence. As another example, the user may fold the screen to a certain angle (for example, about 120°), select the selfie camera, and still follow the original picture while shooting, capturing a side-angle version of the same video and adding to the fun of shooting.
In the process of video shooting, a user can simultaneously look at the reference video file played on the first screen unit and the own action of the user shot by the camera on the second screen unit so as to assist in completing shooting, shooting contents do not need to be memorized repeatedly, and the own action can be adjusted in real time according to the real-time reference video file.
Further, in the above embodiment of the present invention, when the first screen unit plays the reference video file, shooting a video by a camera of the terminal and playing a video shot by the camera in real time on the second screen unit includes:
and playing the image of the reference video file through a first screen unit, playing the video image shot by the camera in real time through a second screen unit, and playing the audio of the reference video file through an audio playing device of the terminal.
It should be noted that, in order to avoid mutual interference of multiple audios, the video playing method according to the embodiment of the present invention only uses one audio playing device, that is, the first screen unit and the second screen unit respectively play different video images, but use the audio corresponding to the same audio source. In general, in the case where the first screen unit plays an image of a reference video file and the second screen unit plays a video image photographed in real time, an audio playing device of the terminal plays audio of the reference video file, for example, background music of the reference video file, and the like.
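The single-audio-source rule can be illustrated with a small selector (the function name and labels are assumptions for illustration): when one screen plays a reference file its audio wins, and when both screens play live camera feeds a preset track is used, matching the two embodiments in this document.

```python
def select_audio(first_screen_source, second_screen_source):
    """Pick the one audio track routed to the terminal's audio playing device."""
    sources = (first_screen_source, second_screen_source)
    if "reference" in sources:
        # reference-plus-camera case: play the reference file's audio,
        # e.g. its background music
        return "reference"
    # two live-camera feeds: fall back to a preset audio track
    return "preset"

print(select_audio("reference", "camera"))  # → reference
print(select_audio("camera", "camera"))     # → preset
```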
Further, in the above embodiment of the present invention, the method further includes:
and when the reference video file on the first screen unit is played, synchronously finishing shooting of a camera of the terminal.
When playback of the reference video file on the first screen unit finishes, the camera automatically stops shooting at the same moment, achieving a synchronized ending and sparing the user from later trimming unwanted footage.
Further, in the above embodiment of the present invention, after the user finishes shooting, the video shot by the camera is cached. The user can then tap the video play key on either the first or the second screen unit to view the reference video file and the shot video at the same time, with one-key pause, one-key play, and one-key end for both.
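The one-key review controls described above can be sketched as a tiny controller (all names hypothetical): the reference file and the cached shot video share a single transport state, so one tap controls both.

```python
class ReviewController:
    def __init__(self):
        self.state = "stopped"

    def one_key(self, command):
        # map a single tap to the same transport state on both videos
        transitions = {"play": "playing", "pause": "paused", "end": "stopped"}
        self.state = transitions[command]
        return {"reference": self.state, "shot": self.state}

c = ReviewController()
print(c.one_key("play"))  # both videos enter the playing state together
```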
In the embodiment of the invention, the user can conveniently compare the original video with the selfie video, check how faithfully the self-made content reproduces it, and play both in linkage. There is no need to tap multiple start buttons at once, which removes the awkward operation and the picture/sound desynchronization caused by non-simultaneous taps.
As shown in fig. 2, the video playing method provided in the embodiment of the present invention specifically includes:
selecting a linked shooting mode in the camera, simultaneously awakening two screen units (an A screen and a B screen respectively) to enter a double-screen mode, and selecting the A screen to display a reference video picture and the B screen to display a shooting picture;
operating on a playing screen A of a reference video picture, and selecting video content to be imitated (both local video and networking video can be selected);
the user selects to call the A screen camera and the B screen camera or simultaneously calls the AB screen camera according to own habits and shooting preferences, and shooting is carried out. Clicking a play button on a video, starting to enter countdown 3..2..1.. GO, and setting countdown duration by a user to give sufficient preparation time before shooting;
and after the countdown is finished, the A screen starts to play the reference video picture, the B screen starts to play the shot picture, the B screen only keeps the video shot picture, and the video sound uses the sound of the reference video.
When the reference video on screen A starts playing, the selfie on screen B starts synchronously, providing one-key linked playing and shooting; the user does not need to tap both screens at once (tapping two screens separately is awkward and leaves picture and sound out of sync). When the reference video on screen A finishes playing, shooting on screen B pauses automatically, providing linkage at the end of shooting and avoiding invalid trailing content that the user would otherwise have to edit out.
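The countdown and the synchronized ending can be laid out on a simple timeline (an illustrative sketch; the function name and numbers are invented): both screens start when the user-set countdown reaches zero, and recording stops exactly when the reference video ends, leaving no trailing footage.

```python
def session_timeline(countdown_s, reference_len_s):
    # both screens start together when the countdown (3..2..1..GO) ends
    start = countdown_s
    # the camera stops the moment the reference video finishes playing
    end = start + reference_len_s
    return {"playback_start": start,
            "recording_start": start,
            "recording_stop": end}

t = session_timeline(countdown_s=3, reference_len_s=15)
print(t)  # recording starts at 3 s and stops automatically at 18 s
```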
In summary, the above embodiment of the present invention adopts linked shooting: one screen unit displays the reference video, the other displays the shot picture, and one key starts and ends both. This solves the problems that the user forgets the moves and struggles to stay in sync while shooting an imitation video, and that a few seconds of unwanted footage remain after shooting ends.
As another example, step 102 includes:
determining that a first screen unit of the at least two screen units plays a video shot in real time by a first camera arranged on the first screen unit, and determining that a second screen unit of the at least two screen units plays a video shot in real time by a second camera arranged on the second screen unit;
step 104 comprises:
the first camera and the second camera shoot videos simultaneously, the videos shot by the first camera in real time are played on the first screen unit, and the videos shot by the second camera in real time are played on the second screen unit.
In this embodiment of the invention, the user may choose not to load a reference video file and instead start the first screen unit with the first camera and the second screen unit with the second camera at the same time, shooting simultaneously for multi-angle fun shooting. For example, the two screen units can display shots from different angles; or one can use a normal focal length for a full shot while the other zooms in on facial details; or different special effects and filters can be applied on different screen units, further adding to the fun of shooting.
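A possible configuration for this dual-camera mode (a sketch under assumed names; none of these identifiers come from the patent): each screen unit pairs with its own camera, focal length, and filter, and a single call starts both captures so the two takes stay in sync.

```python
from dataclasses import dataclass

@dataclass
class CaptureConfig:
    screen: str   # which screen unit displays this take
    camera: str   # which camera shoots it
    focal: str    # e.g. "normal" for a full shot, "zoom" for facial detail
    effect: str   # per-screen filter or special effect

def start_dual_capture(configs):
    # one command starts every configured camera at the same time
    return [f"{c.screen}:{c.camera}:{c.focal}:{c.effect}:recording"
            for c in configs]

states = start_dual_capture([
    CaptureConfig("A", "cam_a", "normal", "none"),
    CaptureConfig("B", "cam_b", "zoom", "beauty_filter"),
])
print(states)  # both takes report "recording" together
```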
Optionally, playing the video shot by the second camera in real time on the second screen unit includes:
and playing the video image shot by the first camera in real time through the first screen unit, playing the video image shot by the second camera in real time through the second screen unit, and playing preset audio through an audio playing device of the terminal.
Due to the lens switching selection capability, different angle pictures of the same shot video can be selected on the first screen unit and the second screen unit at the same time, the shooting interest is increased, and the problem of asynchronism caused by two times of shooting does not exist.
It should be noted that, in order to avoid mutual interference of multiple audios, the video playing method according to the embodiment of the present invention only uses one audio playing device, that is, the first screen unit and the second screen unit respectively play different video images, but use the audio corresponding to the same audio source.
As shown in fig. 3, the video playing method provided in the embodiment of the present invention specifically includes:
selecting a linkage mode in the camera, simultaneously awakening two screen units (an A screen and a B screen respectively) and entering a double-screen mode;
respectively selecting playing contents on the screen A and the screen B as a video shot by a lens A (a camera on the screen A) and a video shot by a lens B (a camera on the screen B);
tapping on either screen A or screen B synchronously plays the two videos shot by lens A and lens B; because the same background music is used, the synchronism and shooting effect of the screen A and screen B content can be compared directly.
In summary, in the embodiments of the present invention, by waking up at least two screen units simultaneously and simultaneously playing the corresponding video sources on the woken-up screen units, the user is assisted in following actions while shooting and in comparing effects after shooting.
As shown in fig. 4, an embodiment of the present invention further provides a terminal 400, where the terminal includes at least two screen units, and the terminal includes:
a starting module 401, configured to receive a first input, wherein the first input is used to start a multi-screen linked shooting function;
a determining module 402, configured to determine a video source to be played in each of at least two screen units used in multi-screen linked shooting, where the video source includes at least one of a reference video file and a video shot by a camera of the terminal in real time;
a receiving module 403, configured to receive a second input, where the second input is a play command for playing a reference video file or a shooting command for shooting a video;
a playing module 404, configured to play videos of corresponding video sources on the at least two screen units simultaneously in response to the second input.
Optionally, in the foregoing embodiment of the present invention, the determining module 402 includes:
the first determining sub-module 4021 is configured to determine that a reference video file is played in a first screen unit of the at least two screen units, and determine that a video shot by a camera of the terminal in real time is played in a second screen unit of the at least two screen units;
the playing module 404 includes:
the first playing sub-module 4041 is configured to shoot a video through a camera of the terminal and play the video shot by the camera in real time on the second screen unit while the first screen unit plays the reference video file.
Optionally, in the embodiment of the present invention, the video shot by the camera of the terminal in real time includes any one of the following:
a video shot in real time by a first camera arranged on a first screen unit;
the video shot in real time by a second camera arranged on a second screen unit;
the first camera arranged on the first screen unit and the second camera arranged on the second screen unit jointly shoot videos in real time.
Optionally, in the foregoing embodiment of the present invention, the first playing sub-module 4041 includes:
and the first playing unit is used for playing the image of the reference video file through the first screen unit, playing the video image shot by the camera in real time through the second screen unit, and playing the audio of the reference video file through an audio playing device of the terminal.
Optionally, in the foregoing embodiment of the present invention, the terminal further includes:
and the first ending module is used for synchronously ending the shooting of the camera of the terminal when the playing of the reference video file on the first screen unit is finished.
Optionally, in the foregoing embodiment of the present invention, the determining module 402 includes:
the second determining sub-module 4022 is configured to determine that a first screen unit of the at least two screen units plays a video captured in real time by a first camera disposed on the first screen unit, and determine that a second screen unit of the at least two screen units plays a video captured in real time by a second camera disposed on the second screen unit;
the playing module 404 includes:
the second playing submodule 4042 is configured to capture videos through the first camera and the second camera simultaneously, play the video captured by the first camera in real time on the first screen unit, and play the video captured by the second camera in real time on the second screen unit.
Optionally, in the foregoing embodiment of the present invention, the second playing sub-module 4042 includes:
and the second playing unit is used for playing the video image shot by the first camera in real time through the first screen unit, playing the video image shot by the second camera in real time through the second screen unit, and playing preset audio through an audio playing device of the terminal.
The terminal provided by the embodiment of the present invention can implement each process implemented by the terminal in the method embodiments of fig. 1 to fig. 3, and is not described herein again to avoid repetition.
In summary, in the embodiments of the present invention, by waking up at least two screen units simultaneously and simultaneously playing the corresponding video sources on the woken-up screen units, the user is assisted in following actions while shooting and in comparing effects after shooting.
It should be noted that, the terminal provided in the embodiments of the present invention is a terminal capable of executing the video playing method, and all embodiments of the video playing method are applicable to the terminal, and can achieve the same or similar beneficial effects.
Fig. 6 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention, where the terminal includes at least two screen units and the terminal 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and a power supply 611. Those skilled in the art will appreciate that the terminal configuration shown in fig. 6 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 607 is configured to receive a first input, where the first input is used to start a multi-screen linked shooting function;
the processor 610 is configured to determine a video source to be played by each of at least two screen units used for multi-screen linked shooting, where the video source includes at least one of a reference video file and a video shot in real time by a camera of the terminal;
the user input unit 607 is further configured to receive a second input, where the second input is a play command for playing a reference video file or a shooting command for shooting a video;
and the processor 610 is further configured to control, in response to the second input, the display unit 606 to simultaneously play the video of the corresponding video source on each of the at least two screen units.
In summary, in the embodiments of the present invention, by waking up at least two screen units simultaneously and playing the corresponding video sources on the at least two awakened screen units simultaneously, the user can more easily follow actions while shooting and compare effects after shooting.
It should be noted that the terminal provided in the embodiments of the present invention is a terminal capable of executing the video playing method described above; all embodiments of the video playing method are applicable to this terminal and achieve the same or similar beneficial effects.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 601 may be used for receiving and sending signals during message transmission/reception or a call. Specifically, it receives downlink data from a base station and forwards it to the processor 610 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Furthermore, the radio frequency unit 601 may also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 602, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602, or stored in the memory 609, into an audio signal and output it as sound. The audio output unit 603 may also provide audio output related to a specific function performed by the terminal 600 (e.g., a call signal reception sound or a message reception sound). The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
The input unit 604 is used to receive audio or video signals. The input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042. The GPU 6041 processes image data of a still picture or video obtained by an image capturing apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 606, stored in the memory 609 (or another storage medium), or transmitted via the radio frequency unit 601 or the network module 602. The microphone 6042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 601.
The terminal 600 also includes at least one sensor 605, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 6061 according to the brightness of ambient light, and a proximity sensor, which can turn off the display panel 6061 and/or the backlight when the terminal 600 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used to identify the terminal posture (e.g., horizontal/vertical screen switching, related games, magnetometer posture calibration) and for vibration-identification functions (e.g., pedometer, tapping). The sensor 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail herein.
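As a concrete illustration of how an accelerometer reading can drive the horizontal/vertical screen switching mentioned above: when the device is roughly stationary, the measured acceleration is dominated by gravity, so comparing the magnitudes of the x- and y-axis components indicates the orientation. The sketch below uses illustrative thresholds and function names that are not from the patent.

```python
# Sketch of orientation detection from a 3-axis accelerometer reading (in g).
# When the total magnitude is close to 1 g, the device is roughly stationary
# and gravity dominates; comparing |ax| and |ay| then indicates portrait vs
# landscape. The tolerance value is illustrative, not from the patent.

import math


def orientation(ax: float, ay: float, az: float, tol: float = 0.3) -> str:
    """Return 'portrait', 'landscape', 'flat', or 'unknown'."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if abs(g - 1.0) > tol:                      # device is accelerating
        return "unknown"
    if abs(az) > max(abs(ax), abs(ay)):         # gravity along the z-axis
        return "flat"
    return "portrait" if abs(ay) >= abs(ax) else "landscape"


print(orientation(0.0, -1.0, 0.05))   # portrait (gravity along the y-axis)
print(orientation(0.98, 0.02, 0.1))   # landscape
```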
The display unit 606 is used to display information input by the user or information provided to the user. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 607 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. The touch panel 6071, also referred to as a touch screen, may collect touch operations performed by a user on or near it (e.g., operations performed on or near the touch panel 6071 using a finger, a stylus, or any suitable object or accessory). The touch panel 6071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, sends the coordinates to the processor 610, and receives and executes commands sent by the processor 610. In addition, the touch panel 6071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 6071, the user input unit 607 may include other input devices 6072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick; these are not described herein again.
Further, the touch panel 6071 may be overlaid on the display panel 6061. When the touch panel 6071 detects a touch operation on or near it, the operation is transmitted to the processor 610 to determine the type of the touch event, and the processor 610 then provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although in Fig. 6 the touch panel 6071 and the display panel 6061 are two independent components implementing the input and output functions of the terminal, in some embodiments the touch panel 6071 and the display panel 6061 may be integrated to implement these functions; this is not limited here.
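The touch path just described (detection device → touch controller → processor 610 → visual output on display panel 6061) can be summarized as a small pipeline. The function names, the raw-signal format, and the coordinate conversion here are purely illustrative assumptions:

```python
# Illustrative sketch of the touch-input path described above: the detection
# device supplies a raw signal, the touch controller converts it to touch-point
# coordinates, and the processor chooses a visual output for the event type.
# The signal format and the 10x grid scaling are hypothetical.

def touch_controller(raw_signal):
    """Convert a raw touch signal into touch-point coordinates."""
    return {"x": raw_signal["col"] * 10, "y": raw_signal["row"] * 10}


def processor_visual_output(event_type, coords):
    """Map a touch-event type to a visual output (cf. processor 610)."""
    if event_type == "tap":
        return f"highlight at ({coords['x']}, {coords['y']})"
    if event_type == "long_press":
        return f"context menu at ({coords['x']}, {coords['y']})"
    return "no-op"


coords = touch_controller({"col": 3, "row": 5})
print(processor_visual_output("tap", coords))   # highlight at (30, 50)
```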
The interface unit 608 is an interface for connecting an external device to the terminal 600. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal 600 or may be used to transmit data between the terminal 600 and an external device.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook), and the like. Further, the memory 609 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 610 is the control center of the terminal. It connects the various parts of the entire terminal using various interfaces and lines, and performs the functions of the terminal and processes data by running or executing software programs and/or modules stored in the memory 609 and calling data stored in the memory 609, thereby monitoring the terminal as a whole. The processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor, which mainly handles the operating system, user interface, application programs, and the like, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 610.
The terminal 600 may further include a power supply 611 (e.g., a battery) for supplying power to the various components, and preferably, the power supply 611 is logically connected to the processor 610 via a power management system, so that functions of managing charging, discharging, and power consumption are performed via the power management system.
In addition, the terminal 600 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor, a memory, and a computer program stored in the memory and executable on the processor. When executed by the processor, the computer program implements each process of the above video playing method embodiment and achieves the same technical effects; to avoid repetition, details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the above video playing method embodiment and achieves the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (12)

1. A video playing method, applied to a terminal comprising at least two screen units, characterized in that the method comprises the following steps:
receiving a first input, wherein the first input is used for starting a multi-screen linkage shooting function;
determining a video source to be played by each screen unit of at least two screen units adopted by multi-screen linkage shooting, wherein the video source comprises at least one of a reference video file and a video shot by a camera of the terminal in real time;
receiving a second input, wherein the second input is a playing command for playing a reference video file or a shooting command for shooting a video;
in response to the second input, simultaneously playing videos of corresponding video sources on the at least two screen units;
before video shooting starts, a user clicks a play key on a first screen unit, and a camera of the terminal synchronously shoots a video and plays the video on a second screen unit in a linkage manner; or before video shooting starts, the user clicks a shooting key on the second screen unit, and the reference video file on the first screen unit is synchronously played in a linkage manner;
after the user finishes video shooting, the video shot by the camera is cached, and the user clicks a video play key on the first screen unit or the second screen unit to watch the reference video file and the shot video simultaneously;
the first screen unit and the second screen unit respectively play different video images and use audio corresponding to the same audio source;
wherein the determining a video source to be played by each screen unit of at least two screen units adopted by the multi-screen linkage shooting comprises:
determining that a reference video file is played in a first screen unit of the at least two screen units, and determining that a video shot by a camera of the terminal in real time is played in a second screen unit of the at least two screen units;
the simultaneously playing videos corresponding to video sources on the at least two screen units comprises:
shooting a video through a camera of the terminal and playing the video shot by the camera in real time on the second screen unit while the first screen unit plays the reference video file;
wherein the video shot by the camera of the terminal in real time comprises:
a video shot in real time by a first camera arranged on the first screen unit.
2. The method according to claim 1, wherein the capturing a video by a camera of the terminal and playing a video captured by the camera in real time on the second screen unit while the reference video file is played on the first screen unit comprises:
and playing the image of the reference video file through a first screen unit, playing the video image shot by the camera in real time through a second screen unit, and playing the audio of the reference video file through an audio playing device of the terminal.
3. The method of claim 1, further comprising:
and when the reference video file on the first screen unit is played, synchronously finishing shooting of a camera of the terminal.
4. The method according to claim 1, wherein the determining a video source to be played by each of at least two screen units adopted by the multi-screen linked shooting comprises:
determining that a first screen unit of the at least two screen units plays a video shot in real time by a first camera arranged on the first screen unit, and determining that a second screen unit of the at least two screen units plays a video shot in real time by a second camera arranged on the second screen unit;
the simultaneously playing videos corresponding to video sources on the at least two screen units comprises:
the first camera and the second camera shoot videos simultaneously, the videos shot by the first camera in real time are played on the first screen unit, and the videos shot by the second camera in real time are played on the second screen unit.
5. The method of claim 4, wherein playing the video captured by the first camera in real time on the first screen unit and playing the video captured by the second camera in real time on the second screen unit comprises:
and playing the video image shot by the first camera in real time through the first screen unit, playing the video image shot by the second camera in real time through the second screen unit, and playing preset audio through an audio playing device of the terminal.
6. A terminal comprising at least two screen units, the terminal comprising:
the system comprises a starting module, a display module and a display module, wherein the starting module is used for receiving a first input, and the first input is used for starting a multi-screen linkage shooting function;
the device comprises a determining module, a display module and a display module, wherein the determining module is used for determining a video source to be played by each screen unit of at least two screen units adopted by multi-screen linkage shooting, and the video source comprises at least one of a reference video file and a video shot by a camera of the terminal in real time;
the receiving module is used for receiving a second input, wherein the second input is a playing command for playing a reference video file or a shooting command for shooting a video;
a playing module for simultaneously playing videos of corresponding video sources on the at least two screen units in response to the second input;
before video shooting starts, a user clicks a play key on a first screen unit, and a camera of the terminal synchronously shoots a video and plays the video on a second screen unit in a linkage manner; or before video shooting starts, the user clicks a shooting key on the second screen unit, and the reference video file on the first screen unit is synchronously played in a linkage manner;
after the user finishes video shooting, the video shot by the camera is cached, and the user clicks a video play key on the first screen unit or the second screen unit to watch the reference video file and the shot video simultaneously;
the first screen unit and the second screen unit respectively play different video images and use audio corresponding to the same audio source;
the determining module comprises:
the first determining submodule is used for determining that a reference video file is played in a first screen unit of the at least two screen units, and determining that a video shot by a camera of the terminal in real time is played in a second screen unit of the at least two screen units;
the playing module comprises:
the first playing submodule is used for shooting videos through a camera of the terminal and playing the videos shot by the camera in real time on the second screen unit while the first screen unit plays the reference video file;
the video shot by the camera of the terminal in real time comprises:
the first camera arranged on the first screen unit shoots videos in real time.
7. The terminal of claim 6, wherein the first play sub-module comprises:
and the first playing unit is used for playing the image of the reference video file through the first screen unit, playing the video image shot by the camera in real time through the second screen unit, and playing the audio of the reference video file through an audio playing device of the terminal.
8. The terminal of claim 6, further comprising:
and the first ending module is used for synchronously ending the shooting of the camera of the terminal when the playing of the reference video file on the first screen unit is finished.
9. The terminal of claim 6, wherein the determining module comprises:
the second determining submodule is used for determining that a first screen unit in the at least two screen units plays the video shot in real time by the first camera arranged on the first screen unit, and determining that a second screen unit in the at least two screen units plays the video shot in real time by the second camera arranged on the second screen unit;
the playing module comprises:
and the second playing submodule is used for simultaneously shooting videos through the first camera and the second camera, playing the videos shot by the first camera in real time on the first screen unit, and playing the videos shot by the second camera in real time on the second screen unit.
10. The terminal of claim 9, wherein the second playing submodule comprises:
and the second playing unit is used for playing the video image shot by the first camera in real time through the first screen unit, playing the video image shot by the second camera in real time through the second screen unit, and playing preset audio through an audio playing device of the terminal.
11. A terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the video playback method according to any one of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the video playback method according to one of claims 1 to 5.
CN201811611824.7A 2018-12-27 2018-12-27 Video playing method and terminal Active CN109451178B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811611824.7A CN109451178B (en) 2018-12-27 2018-12-27 Video playing method and terminal

Publications (2)

Publication Number Publication Date
CN109451178A CN109451178A (en) 2019-03-08
CN109451178B true CN109451178B (en) 2021-03-12

Family

ID=65538555

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811611824.7A Active CN109451178B (en) 2018-12-27 2018-12-27 Video playing method and terminal

Country Status (1)

Country Link
CN (1) CN109451178B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109951733B (en) * 2019-04-18 2021-10-22 北京小米移动软件有限公司 Video playing method, device, equipment and readable storage medium
CN109922271A (en) * 2019-04-18 2019-06-21 珠海格力电器股份有限公司 A kind of mobile terminal and its photographic method based on Folding screen
CN116074564A (en) * 2019-08-18 2023-05-05 聚好看科技股份有限公司 Interface display method and display device
CN114915745B (en) * 2021-02-07 2023-11-03 华为技术有限公司 Multi-view video recording method and device and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016021744A1 (en) * 2014-08-04 2016-02-11 엘지전자(주) Mobile terminal and control method therefor
CN105898133A (en) * 2015-08-19 2016-08-24 乐视网信息技术(北京)股份有限公司 Video shooting method and device
CN107040719A (en) * 2017-03-21 2017-08-11 宇龙计算机通信科技(深圳)有限公司 Filming control method and imaging control device based on double screen terminal
CN107770312A (en) * 2017-11-07 2018-03-06 广东欧珀移动通信有限公司 Method for information display, device and terminal
CN108566519A (en) * 2018-04-28 2018-09-21 腾讯科技(深圳)有限公司 Video creating method, device, terminal and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101816168B1 (en) * 2011-09-08 2018-01-09 삼성전자 주식회사 Apparatus and contents playback method thereof
CN106792080A (en) * 2016-12-07 2017-05-31 北京小米移动软件有限公司 Video broadcasting method and device
CN107368150A (en) * 2017-06-30 2017-11-21 维沃移动通信有限公司 A kind of photographic method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant