CN115665459A - Method for playing multi-machine live program - Google Patents
Method for playing multi-machine live program
- Publication number
- CN115665459A (application CN202211106007.2A)
- Authority
- CN
- China
- Prior art keywords
- area
- video
- picture
- view
- actor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Processing Or Creating Images (AREA)
Abstract
The invention relates to the field of information technology and specifically discloses a method for playing a multi-camera live program. The video is played in full screen and the whole picture is divided into three parts: a multi-camera picture in area A on the left, the main playback picture in area B in the middle, and a 'watch TA only' (single-performer) picture in area C on the right. The width occupied by area A is fixed as mLW, and the actual height mLH of each video is calculated from the video's own width and height and its ratio to mLW; the width occupied by area C is fixed as mRW, and the height mRH of each actor picture is calculated from the width and height of the right-side actor pictures and their ratio to mRW; area B then plays the multi-camera video through a TextureView control according to the user's selection.
Description
Technical Field
The invention belongs to the field of information technology, and in particular relates to a method for playing multi-camera live programs.
Background
Multi-camera video playback is a new technology that uses the broadband network as the transmission channel and an Android mobile phone as the terminal, integrates Internet, multimedia and other technologies, and provides users with interactive services that include video content.
A search found Chinese patent application No. 201510920806.7, which discloses a method and device for synchronous playback of multi-camera video: a terminal obtains, from a network server, information on the multi-camera video data sources of the same program; after receiving the user's click operations on the camera-position information list, the terminal determines the selected camera positions and sends a data request to the network server; after receiving the data, the terminal plays the video content of the selected camera positions in a main playback window and in secondary playback windows in turn. That application also discloses a device for synchronous multi-camera video playback, in which the terminal comprises an information acquisition module, a request sending module and a video playback module, and the network server comprises an information sending module and a request execution module, as well as a system comprising the terminal and the network server, so that a user can select different live viewing angles to watch, or switch between them, according to viewing needs, providing a humanized user experience.
However, that prior invention still has the following defects:
it is not convenient to play multi-camera video according to the view selected by the user, several close-up videos cannot be played on the same screen, and the user's viewing experience suffers.
Disclosure of Invention
The invention aims to provide a method for playing a multi-camera live program in which the left, middle and right regions of the video are calculated by an algorithm and the multi-camera video is then played according to the view selected by the user. This gives the user a better viewing experience and is particularly effective for watching a specific performer during a concert, thereby solving the problems raised in the background section.
To achieve the above purpose, the invention adopts the following technical solution:
A method for playing a multi-camera live program comprises the following steps:
S1, the video is played in full screen and the whole picture is divided into an area A, an area B and an area C, where area A on the left is the multi-camera picture, area B in the middle is the main playback picture, and area C on the right is the 'watch TA only' (single-performer) picture;
S2, the width occupied by area A is fixed as mLW, and the actual height mLH of each video is calculated from the video's own width and height and its ratio to mLW;
S3, the width occupied by area C is fixed as mRW, and the height mRH of each actor picture is calculated from the width and height of the right-side actor pictures and their ratio to mRW;
and S4, area B plays the multi-camera video through a TextureView control according to the user's selection.
Preferably, the application scenario of the video playback in step S1 is mainly a live concert, where the user switches video pictures through the small videos on the left and selects close-up videos of particular actors to watch.
Preferably, in steps S1 and S2, area A is hidden outside the left edge of the screen by moving it left by mLW with the setTranslationX method, mLW being the distance needed to hide area A; area A is a video list, which is divided into multiple views such as 4K ultra-high definition, top-down shot, dynamic shot and stage close-up.
Preferably, when area A displays the video list, a RecyclerView control is used to display the data list, and TextureView is then used to project the content stream directly into the view.
Preferably, in steps S1 and S3, area C is hidden outside the right edge of the screen by moving it right by mRW with the setTranslationX method; area C is a list of actor pictures, and when the user clicks a particular actor's picture, the close-up camera-angle video of that actor is inserted into area B.
Preferably, when area C shows an actor close-up angle video, the width of the main view (area B) is taken as W, so aW = W/3 is the width of the inserted video; the video height aH is calculated from the original video size and the ratio to aW; the inserted close-up view is placed at the position top - aH, and when the user clicks an actor picture, the corresponding video is pulled down from the top with a TranslateAnimation.
Preferably, in step S4, the TextureView control projects the content stream directly into the view and is used to implement the live-preview function, and a multi-view button and a 'watch TA only' button are provided.
Preferably, when multi-camera video is played, there are the following two cases:
A1, when the user clicks the multi-view button, the setTranslationX method is called on the area A view to show it, i.e. area A moves right by mLW, and the area B view moves right by mLW in linkage; at this moment the multi-view videos and the selected video are shown together, so several streams can be watched at the same time;
A2, when the user clicks the 'watch TA only' button, the setTranslationX method is called on the area C view to move it left by mRW and show its content; at the same time area B moves left, and area B displays several actor close-up videos.
Preferably, the setTranslationX method in steps A1 and A2 changes the position of a view without changing the margin attributes in the view's LayoutParams, and area B in step A2 shows at most three actor close-up videos.
Compared with the prior art, the method for playing a multi-camera live program has the following advantages:
the invention mainly plays the full screen when the video is played, the whole picture is divided into three parts, namely a left area A multi-machine-position picture, a middle area B plays a main picture, a right area C only watches a TA picture, and then the view areas of the area A, the area B and the area C are confirmed, so that the related functions of the area A, the area B and the area C are respectively realized, the left, middle and right areas of the area A, the area B and the area C of the video are systematically calculated according to an algorithm, the play of the multi-machine-position video is realized according to the view selected by a user, the better viewing experience of the user is improved, and particularly, the full screen video playing method has better effect when an actor is watched in a singing meeting.
Drawings
FIG. 1 is a block flow diagram of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention; the specific embodiments described here merely illustrate the invention and are not intended to limit it. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present invention.
The invention provides a method for playing multi-camera live programs, as shown in FIG. 1, comprising the following steps:
S1, the video is played in full screen and the whole picture is divided into an area A, an area B and an area C, where area A on the left is the multi-camera picture, area B in the middle is the main playback picture, and area C on the right is the 'watch TA only' (single-performer) picture;
the application scene of the video playing is mainly that video pictures are switched through a small left video and a close-up video of a fixed actor is selected to watch the video when a concert is live.
S2, the width occupied by area A is fixed as mLW, and the actual height mLH of each video is calculated from the video's own width and height and its ratio to mLW;
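As a minimal sketch of this scaling rule (the variable names and example dimensions are illustrative assumptions, not values from the patent), each thumbnail in area A keeps the source video's aspect ratio while being constrained to the width mLW; the same rule yields mRH from mRW in step S3 below:

```java
// Step-S2 scaling rule (sketch): constrain each area-A video to width mLW and
// derive its height from the source stream's own width/height ratio.
int srcWidth = 1920, srcHeight = 1080;   // assumed source video dimensions
int mLW = 320;                           // width reserved for area A, in pixels
int mLH = Math.round(mLW * (srcHeight / (float) srcWidth));   // 320 * 1080/1920 = 180
```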
the area A is hidden outside the left side of a screen, the area A needs to be hidden by moving left mLW by a setTranslationX method for hiding the area A, the area A is a video list, the video list is divided into a plurality of views of 4K ultra high definition, top shot pictures, dynamic pictures and stage close-up scenes, and the views of other scenes can be added into the video list according to actual use requirements.
When area A displays the video list, the data list is shown with a RecyclerView control, and TextureView then projects the content stream directly into each view. This greatly improves the user's viewing experience: panoramic, top-down and stage-view videos can all be watched in the Android client's live program.
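This pairing can be sketched roughly as follows. It is a hypothetical adapter (class and field names are assumptions, not taken from the patent) in which RecyclerView supplies the scrolling list for area A and every item hosts a TextureView that a decoder can later render into:

```java
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.FrameLayout;
import androidx.annotation.NonNull;
import androidx.recyclerview.widget.RecyclerView;
import java.util.List;

// Hypothetical adapter for the area-A camera list.
public class CameraListAdapter extends RecyclerView.Adapter<CameraListAdapter.Holder> {

    private final List<String> streamUrls; // one stream URL per camera position (assumed)
    private final int mLW;                 // item width reserved for area A
    private final int mLH;                 // item height from the step-S2 ratio

    public CameraListAdapter(List<String> streamUrls, int mLW, int mLH) {
        this.streamUrls = streamUrls;
        this.mLW = mLW;
        this.mLH = mLH;
    }

    static class Holder extends RecyclerView.ViewHolder {
        final TextureView textureView;
        Holder(View itemView, TextureView textureView) {
            super(itemView);
            this.textureView = textureView;
        }
    }

    @NonNull
    @Override
    public Holder onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
        FrameLayout item = new FrameLayout(parent.getContext());
        item.setLayoutParams(new RecyclerView.LayoutParams(mLW, mLH));
        TextureView textureView = new TextureView(parent.getContext());
        item.addView(textureView, new FrameLayout.LayoutParams(mLW, mLH));
        return new Holder(item, textureView);
    }

    @Override
    public void onBindViewHolder(@NonNull Holder holder, int position) {
        // A real implementation would attach a decoder to holder.textureView's
        // SurfaceTexture here and start streamUrls.get(position) (see step S4 below).
    }

    @Override
    public int getItemCount() {
        return streamUrls.size();
    }
}
```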
S3, the width occupied by area C is fixed as mRW, and the height mRH of each actor picture is calculated from the width and height of the right-side actor pictures and their ratio to mRW;
the C area is hidden outside the right side of the screen, the C area needs to be hidden by moving the C area right mRW in a setTranslationX method, the C area is an actor picture list, when a user clicks a certain actor picture, the B area can be inserted into the angle video of the close-up of the actor, the video watching experience of the user is greatly improved through the method, and the video watching of the single actor view angle can be switched in the live program of the Android client.
When area C shows an actor close-up angle video, the width of the main view (area B) is taken as W, so aW = W/3 is the width of the inserted video; the video height aH is calculated from the original video size and the ratio to aW; the inserted close-up view is placed at the position top - aH, and when the user clicks an actor picture, the corresponding video is pulled down from the top with a TranslateAnimation.
TranslateAnimation is the animation effect for movement. It has three constructors. The first, public TranslateAnimation(Context context, AttributeSet attrs), is used when the animation is defined in XML and is not described further here.
public TranslateAnimation(float fromXDelta, float toXDelta, float fromYDelta, float toYDelta) is the constructor most commonly used:
- fromXDelta: the offset of the animation's start point from the view's current X coordinate;
- toXDelta: the offset of the animation's end point from the view's current X coordinate;
- fromYDelta: the offset of the animation's start point from the view's current Y coordinate;
- toYDelta: the offset of the animation's end point from the view's current Y coordinate.
If the view is at point A (x, y), the animation moves it from point B (x + fromXDelta, y + fromYDelta) to point C (x + toXDelta, y + toYDelta).
public TranslateAnimation(int fromXType, float fromXValue, int toXType, float toXValue, int fromYType, float fromYValue, int toYType, float toYValue) is the third constructor. The first parameter, fromXType, is the reference type for the start value in the X direction, and the second, fromXValue, is the start value of that type; the third and fourth parameters are the reference type and value of the end point in the X direction, and the remaining four parameters play the same roles for the Y direction. Taking the X axis as an example, the relationship between reference type and value is: if the type is Animation.ABSOLUTE, the value is a concrete coordinate in screen pixels, for example from 100 to 300; if the type is Animation.RELATIVE_TO_SELF or Animation.RELATIVE_TO_PARENT, the value should be understood as a multiple or percentage of the control itself or of its parent control.
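Putting the size calculation and the pull-down together, a minimal sketch inside the actor-picture click handler of the assumed player Activity might look like this (closeUpView, areaB, srcWidth and srcHeight are illustrative names; the usual android.widget and android.view.animation imports are assumed, and the second constructor listed above is used):

```java
// Size the inserted close-up to one third of area B's width, keep the source
// aspect ratio, park it just above the top edge, then slide it down by aH.
int W  = areaB.getWidth();
int aW = W / 3;                                            // width of the inserted video
int aH = Math.round(aW * (srcHeight / (float) srcWidth));  // height from the original ratio

closeUpView.setLayoutParams(new FrameLayout.LayoutParams(aW, aH));
closeUpView.setY(-aH);                                     // start at "top - aH"

// (fromXDelta, toXDelta, fromYDelta, toYDelta): no horizontal motion, move down by aH
TranslateAnimation slideDown = new TranslateAnimation(0f, 0f, 0f, aH);
slideDown.setDuration(300);
slideDown.setFillAfter(true);                              // keep the end position afterwards
closeUpView.startAnimation(slideDown);

// If closeUpView were laid out at its final position (y = 0) instead of being
// parked at -aH, the same pull-down could use the type-based constructor:
// new TranslateAnimation(Animation.ABSOLUTE, 0f, Animation.ABSOLUTE, 0f,
//         Animation.RELATIVE_TO_SELF, -1f, Animation.RELATIVE_TO_SELF, 0f);
```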
S4, area B plays the multi-camera video through the TextureView control according to the user's selection.
The TextureView control projects the content stream directly into the view and is used to implement the live-preview function; a multi-view button and a 'watch TA only' button are provided.
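One possible way to wire a stream into the B-area TextureView, as a sketch only (MediaPlayer stands in for whatever decoder the implementation actually uses, streamUrl is an assumed address, and the method would live in the player Activity):

```java
// Needs: android.graphics.SurfaceTexture, android.media.MediaPlayer,
//        android.view.Surface, android.view.TextureView
void attachStream(final TextureView areaB, final String streamUrl) {
    areaB.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
            try {
                MediaPlayer player = new MediaPlayer();
                player.setSurface(new Surface(texture));     // project the content stream into the view
                player.setDataSource(streamUrl);
                player.setOnPreparedListener(MediaPlayer::start);
                player.prepareAsync();
            } catch (java.io.IOException e) {
                e.printStackTrace();                         // a real client would surface the error
            }
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) { }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
            return true;                                     // let the system release the SurfaceTexture
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture texture) { }
    });
}
```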
When multi-camera video is played, there are the following two cases:
A1, when the user clicks the multi-view button, the setTranslationX method is called on the area A view to show it, i.e. area A moves right by mLW, and the area B view moves right by mLW in linkage; at this moment the multi-view videos and the selected video are shown together, so several streams can be watched at the same time;
A2, when the user clicks the 'watch TA only' button, the setTranslationX method is called on the area C view to move it left by mRW and show its content; at the same time area B moves left, and area B displays several actor close-up videos.
The setTranslationX method in steps A1 and A2 changes the position of a view without changing the margin attributes in the view's LayoutParams, and area B in step A2 shows at most three actor close-up videos.
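The two cases can be sketched as a pair of click handlers in the assumed player Activity (areaA, areaB and areaC are the region root views; the end positions match the setTranslationX movements described in A1 and A2):

```java
// A1: multi-view. Area A slides right by mLW back onto the screen and area B
// shifts right by the same amount so both are visible side by side.
void onMultiViewClicked(View areaA, View areaB, int mLW) {
    areaA.animate().translationX(0f).start();
    areaB.animate().translationX(mLW).start();
}

// A2: "watch TA only". Area C slides left by mRW back onto the screen and
// area B shifts left so it can show the actor close-up videos.
void onWatchTaOnlyClicked(View areaB, View areaC, int mRW) {
    areaC.animate().translationX(0f).start();
    areaB.animate().translationX(-mRW).start();
}
```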
In summary, the left, middle and right regions of areas A, B and C are calculated by an algorithm, and the multi-camera video is then played according to the view selected by the user, which gives the user a better viewing experience and is particularly effective for watching a specific performer during a concert.
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that modifications may be made to the embodiments or portions thereof without departing from the spirit and scope of the invention.
Claims (9)
1. A method for playing a multi-camera live program, characterized by comprising the following steps:
S1, the video is played in full screen and the whole picture is divided into an area A, an area B and an area C, where area A on the left is the multi-camera picture, area B in the middle is the main playback picture, and area C on the right is the 'watch TA only' (single-performer) picture;
S2, the width occupied by area A is fixed as mLW, and the actual height mLH of each video is calculated from the video's own width and height and its ratio to mLW;
S3, the width occupied by area C is fixed as mRW, and the height mRH of each actor picture is calculated from the width and height of the right-side actor pictures and their ratio to mRW;
and S4, area B plays the multi-camera video through a TextureView control according to the user's selection.
2. The method for playing a multi-camera live program according to claim 1, characterized in that: the application scenario of the video playback in step S1 is mainly a live concert, where the user switches video pictures through the small videos on the left and selects close-up videos of particular actors to watch.
3. The method for playing a multi-camera live program according to claim 1, characterized in that: in steps S1 and S2, area A is hidden outside the left edge of the screen by moving it left by mLW with the setTranslationX method, mLW being the distance needed to hide area A; area A is a video list, which is divided into multiple views such as 4K ultra-high definition, top-down shot, dynamic shot and stage close-up.
4. The method for playing a multi-camera live program according to claim 3, characterized in that: when area A displays the video list, a RecyclerView control is used to display the data list, and TextureView is then used to project the content stream directly into the view.
5. The method for playing a multi-camera live program according to claim 1, characterized in that: in steps S1 and S3, area C is hidden outside the right edge of the screen by moving it right by mRW with the setTranslationX method, mRW being the distance needed to hide area C; area C is a list of actor pictures, and when the user clicks a particular actor's picture, the close-up camera-angle video of that actor is inserted into area B.
6. The method for playing a multi-camera live program according to claim 5, characterized in that: when area C shows an actor close-up angle video, the width of the main view (area B) is taken as W, so aW = W/3 is the width of the inserted video; the video height aH is calculated from the original video size and the ratio to aW; the inserted view is placed at the position top - aH, and when the user clicks an actor picture, the corresponding video is pulled down from the top with a TranslateAnimation.
7. The method for playing a multi-camera live program according to claim 1, characterized in that: in step S4, the TextureView control projects the content stream directly into the view and is used to implement the live-preview function, and a multi-view button and a 'watch TA only' button are provided.
8. The method for playing a multi-camera live program according to claim 7, characterized in that: when multi-camera video is played, there are the following two cases:
A1, when the user clicks the multi-view button, the setTranslationX method is called on the area A view to show it, i.e. area A moves right by mLW, and the area B view moves right by mLW in linkage; at this moment the multi-view videos and the selected video are shown together, so several streams can be watched at the same time;
A2, when the user clicks the 'watch TA only' button, the setTranslationX method is called on the area C view to move it left by mRW and show its content; at the same time area B moves left, and area B displays several actor close-up videos.
9. The method for playing a multi-camera live program according to claim 8, characterized in that: the setTranslationX method in steps A1 and A2 changes the position of a view without changing the margin attributes in the view's LayoutParams, and area B in step A2 shows at most three actor close-up videos.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211106007.2A CN115665459A (en) | 2022-09-09 | 2022-09-09 | Method for playing multi-machine live program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211106007.2A CN115665459A (en) | 2022-09-09 | 2022-09-09 | Method for playing multi-machine live program |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115665459A (en) | 2023-01-31
Family
ID=84983489
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211106007.2A Pending CN115665459A (en) | 2022-09-09 | 2022-09-09 | Method for playing multi-machine live program |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115665459A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |