CN113364967A - Playing method and device of terminal return video - Google Patents

Playing method and device of terminal return video

Info

Publication number
CN113364967A
Authority
CN
China
Prior art keywords
video
terminal
angle
information
playing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010145316.5A
Other languages
Chinese (zh)
Inventor
张光伟
王亮
方伟
鲜柯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TD Tech Chengdu Co Ltd
Chengdu TD Tech Ltd
Original Assignee
Chengdu TD Tech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu TD Tech Ltd
Priority to CN202010145316.5A
Publication of CN113364967A
Legal status: Pending (current)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Abstract

The application discloses a method for playing video returned from a terminal, comprising the following steps: a playback device receives the video sent by the terminal together with shooting angle information for the video; the playback device determines the shooting angle from the shooting angle information, adjusts its own placement angle according to the shooting angle, and plays the video. By applying the method and device, the loss of transmitted information can be reduced as much as possible while preserving the viewing experience.

Description

Playing method and device of terminal return video
Technical Field
The present application relates to video processing technologies in communication systems, and in particular, to a method and an apparatus for playing a video returned by a terminal.
Background
With the development of mobile communication technology, video multimedia has gradually become a common usage scenario for industry users: a terminal device transmits video back to a server or a large screen, making real-time dispatching and monitoring possible. Mobile terminals are easy to carry and flexible to operate, so they have become the main platform for this requirement. However, when a mobile terminal returns video to a large screen or a dispatch console, the playback end must adapt to the terminal: because the terminal's shooting resolution differs from the playback resolution, the playback end has to rescale or crop the video to fit the playback device. This costs additional processing power and loses video information.
A typical mobile terminal has an aspect ratio of about 9:16 (width to height), which suits hand-held use; most scenes are shot with the terminal held vertically, although other shooting angles also occur. Most playback displays are liquid crystal screens with a 16:9 (width to height) ratio, which suits multimedia playback such as television or movies.
When video from a mobile terminal is transmitted back, the playback device has to crop the video or add blank bars to accommodate vertically shot footage, and whether it crops or letterboxes, the viewing experience is imperfect.
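As a quick quantitative illustration (a sketch added for clarity, not part of the original filing; the ratios are the typical 9:16 and 16:9 values mentioned above), either choice keeps only about a third of what was available:

```python
# Sketch: what is lost when a 9:16 (portrait) video meets a 16:9 (landscape) display.
video_w, video_h = 9, 16      # portrait video aspect ratio
screen_w, screen_h = 16, 9    # landscape display aspect ratio

# Letterboxing: scale the video to the screen height and leave blank bars at the sides.
scaled_w = screen_h * video_w / video_h                      # on-screen width of the video
used_screen_fraction = (scaled_w * screen_h) / (screen_w * screen_h)

# Cropping: scale the video to the screen width and cut away the excess height.
scaled_h = screen_w * video_h / video_w                      # height the full video would need
kept_video_fraction = screen_h / scaled_h                    # fraction of the video that survives

print(f"letterboxed: {used_screen_fraction:.1%} of the screen shows video")  # ~31.6%
print(f"cropped:     {kept_video_fraction:.1%} of the video is kept")        # ~31.6%
```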
Disclosure of Invention
The application provides a method for playing video returned by a terminal, which can reduce the loss of transmitted information as much as possible while preserving the viewing experience.
To achieve this purpose, the application adopts the following technical scheme:
a method for playing video returned by a terminal, characterized by comprising the following steps:
the playback device receives the video sent by the terminal and the shooting angle information of the video;
the playback device determines the shooting angle from the shooting angle information, adjusts its placement angle according to the shooting angle, and plays the video.
Preferably, the shooting angle information of the video is obtained as follows:
the terminal determines real-time direction information acquired by a direction sensor while shooting the video, and calculates the shooting angle information from the real-time direction information;
and/or the terminal determines real-time acceleration information acquired by a gravity acceleration sensor while shooting the video, determines the angle between the terminal's screen and the horizontal plane from the real-time acceleration information, and uses that angle as the shooting angle information.
Preferably, adjusting the placement angle of the playback device according to the shooting angle includes:
the playback device outputs the shooting angle to an angle adjustment unit, and a mechanical control device connected to the playback device rotates the playback device until it matches the shooting angle, ensuring that the video is played upright.
Preferably, determining the angle between the terminal's screen and the horizontal plane from the real-time acceleration information includes:
calculating the angle θy between the Y axis and the horizontal plane as θy = [arctan(Ay/sqrt(Ax*Ax + Az*Az))] × 180/π from the acceleration Ax along the X axis, the acceleration Ay along the Y axis, and the acceleration Az along the Z axis of the terminal screen;
where the X and Y axes are parallel to the terminal's screen surface, the Z axis is perpendicular to the screen surface, and sqrt() denotes the square root operation.
According to the above technical scheme, the playback device receives the video sent by the terminal and the shooting angle information of the video; the playback device determines the shooting angle from the shooting angle information, adjusts its placement angle according to the shooting angle, and plays the corresponding video. In this way, the playback device adjusts its placement angle to match the recording angle of the video recording terminal before playing the video, so the loss of transmitted information is reduced as much as possible while preserving the viewing experience.
Drawings
Fig. 1 is a schematic diagram of the basic flow of the method for playing terminal-returned video in the present application;
FIG. 2 is a schematic diagram of the terminal acquiring its real-time direction information through a direction sensor;
FIG. 3 is a schematic diagram of the terminal acquiring its current acceleration information through a gravity acceleration sensor;
FIG. 4 is a schematic diagram of an acceleration sensor chip placed horizontally;
FIG. 5 is a schematic view of the angles between the axes of FIG. 4 and the horizontal direction;
FIG. 6 is a schematic diagram of the components of the gravitational acceleration g along each axis;
FIG. 7 is a diagram illustrating the gravitational acceleration g as the diagonal of a rectangular box.
Detailed Description
For the purpose of making the objects, technical means and advantages of the present application more apparent, the present application will be described in further detail with reference to the accompanying drawings.
At present, there are two typical scenarios in which video shot by a terminal is transmitted back to a playback device:
1. the terminal shoots in portrait (vertical screen), the player displays it vertically, and black bars remain on both sides;
2. the terminal shoots in portrait (vertical screen), and the player displays it full screen.
In this application, the playback end senses the terminal's shooting angle and automatically rotates the playback device, so that no transmitted information is lost and the video is displayed full screen.
Specifically, when a mobile terminal records video, the terminal is kept facing the subject as far as possible no matter how its placement angle changes. Since the placement of the playback end generally does not change, the industry usually crops or letterboxes the video to adapt to the shooting scene. With suitable settings and message passing, however, the loss of transmitted information can be avoided as far as possible while preserving the viewing experience.
Fig. 1 is a schematic diagram of the basic flow of the method for playing video returned by a terminal according to the present application. For convenience of description, the method is described below in terms of the interaction between a video recording terminal and a playback device. As shown in Fig. 1, the method includes:
Step 201: the video recording terminal shoots a video and acquires shooting angle information.
The shooting angle information reflects the real-time shooting angle of the terminal while the video is being shot. It can be acquired in existing ways using the various sensors carried on the terminal. Two typical approaches are given below:
1. Acquiring shooting angle information using a direction sensor
A direction sensor mounted on the terminal can acquire the terminal's real-time direction information, as shown in Fig. 2. While shooting the video, the terminal reads the direction sensor, calculates the shooting angle from the three values currently returned, and transmits the result to the playback device as the shooting angle information.
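The application does not spell out this calculation. A minimal sketch, assuming the three returned values are azimuth, pitch and roll in degrees (as on typical mobile direction sensors) and that the roll value alone captures how far the picture must be rotated to appear upright, could look like this (the function name and the choice of roll are illustrative assumptions, not the patented method):

```python
def shooting_angle_from_orientation(azimuth_deg: float, pitch_deg: float, roll_deg: float) -> float:
    """Assumed reduction of the three direction-sensor values to one rotation angle.
    Only the roll is used here; the result is normalized into (-180, 180] degrees."""
    angle = roll_deg % 360.0
    return angle - 360.0 if angle > 180.0 else angle

# Example: terminal rotated 90 degrees away from its usual portrait orientation.
print(shooting_angle_from_orientation(azimuth_deg=30.0, pitch_deg=0.0, roll_deg=90.0))  # 90.0
```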
2. Acquiring shooting angle information using a gravity acceleration sensor
The acceleration sensor, also called a G-sensor, provides acceleration values along the x, y and z axes. The x and y axes are parallel to the phone's screen surface, and the z axis is perpendicular to it. The accelerations along the three axes include the effect of gravity and are expressed in m/s². With the phone lying screen-up on a desk, the x-axis acceleration defaults to 0, the y-axis acceleration to 0, and the z-axis acceleration to 9.81; with the screen facing down, the z-axis value is -9.81. The angle between the terminal's display screen and the horizontal plane can therefore be obtained from the gravity sensor readings, as shown in Fig. 3. In this application, the angle between the terminal's display screen and the horizontal plane, determined from these readings, can be sent to the playback device as the shooting angle information; only the angle between the Y axis and the horizontal plane needs to be determined, as follows:
the acceleration in the X-axis direction is Ax, and the included angle between the acceleration and the horizontal line is α 1 and the included angle α between the acceleration and the gravitational acceleration is α;
the acceleration in the Y-axis direction is Ay, the included angle between the acceleration and the horizontal line is beta 1, and the included angle between the acceleration and the gravity acceleration g is beta;
the acceleration in the Z-axis direction is Az, the included angle with the horizontal line is γ 1, and the included angle with the gravitational acceleration g is γ;
the included angle θ Y between the Y axis and the horizontal plane is β 1 × 180/pi ═ 180/pi [ arctan (Ay/sqrt (Ax × Ax + Az × Az)) ] × 180/pi. Where sqrt () represents the operation of square root.
The derivation of the calculation of the angle between the Y axis and the horizontal plane is given below:
if the gravitational acceleration chip is horizontally placed, the gravitational component in the X, Y direction is 0, and the gravitational component in the Z-axis direction is g. As shown in fig. 4, X is 0, Y is 0, and Z is g. If each side has some included angle with the horizontal direction, the image is as shown in fig. 5, the acceleration in the X-axis direction is Ax, the included angle with the horizontal line is α 1, and the included angle with the gravity acceleration is α; the acceleration in the Y-axis direction is Ay, the acceleration with the horizontal line is beta 1, and the included angle with the gravity acceleration g is beta; the acceleration in the Z-axis direction is Az, the acceleration with the horizontal line is gamma 1, and the included angle with the gravity acceleration g is gamma.
From the angle relationships in Fig. 5:
α = 90° - α1, β = 90° - β1, γ = 90° - γ1. (1)
The components of g along each axis are: Ax = gcosα, Ay = gcosβ, Az = gcosγ.
Substituting equation (1) into these expressions gives:
Ax = gsinα1, Ay = gsinβ1, Az = gsinγ1. (2)
As shown in Fig. 6, each vertical dashed line has magnitude gcosα1 (and likewise for the other axes), and g*g = Ax*Ax + gcosα1*gcosα1, so:
gcosα1 = sqrt(g*g - Ax*Ax), gcosβ1 = sqrt(g*g - Ay*Ay), gcosγ1 = sqrt(g*g - Az*Az). (3)
From solid geometry, g corresponds to the diagonal of a rectangular box whose three edges are Ax, Ay and Az; as shown in Fig. 7, the squared length of the dashed line equals Ay*Ay + Az*Az, so by the Pythagorean theorem Ax*Ax + Ay*Ay + Az*Az = g*g.
From equations (2) and (3), taking the X axis as an example: sinα1 = Ax/g and cosα1 = sqrt(g*g - Ax*Ax)/g, so tanα1 = (Ax/g)/[sqrt(g*g - Ax*Ax)/g] = Ax/sqrt(g*g - Ax*Ax) = Ax/sqrt(Ay*Ay + Az*Az). Similarly, tanβ1 = Ay/sqrt(Ax*Ax + Az*Az) and tanγ1 = Az/sqrt(Ax*Ax + Ay*Ay).
Finally, the relation between the acceleration sensor values (for example, from an ADXL345) and the angle values in radians is obtained as follows:
tanα1=Ax/sqrt(Ay*Ay+Az*Az),
tanβ1=Ay/sqrt(Ax*Ax+Az*Az),
tanγ1=Az/sqrt(Ax*Ax+Ay*Ay)。
where α1, β1, γ1 are the angles, in radians, between the X, Y, Z axes and the horizontal plane, respectively (the inverse trigonometric functions return radians), and Ax, Ay, Az are the acceleration values along the three axes.
The radian values are therefore:
α1=arctan(Ax/sqrt(Ay*Ay+Az*Az))
β1=arctan(Ay/sqrt(Ax*Ax+Az*Az))
γ1=arctan(Az/sqrt(Ax*Ax+Ay*Ay))
The radian-to-degree conversion formula radian = θ × π × R/180 must then be applied, giving θ = radian × 180/(π × R) with R = 1. The angle values of the axes are finally:
θx=α1*180/π=[arctan(Ax/sqrt(Ay*Ay+Az*Az))]*180/π
θy=β1*180/π=[arctan(Ay/sqrt(Ax*Ax+Az*Az))]*180/π
θz=γ1*180/π=[arctan(Az/sqrt(Ax*Ax+Ay*Ay))]*180/π
θy is then reported as the angle between the screen and the horizontal direction.
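As a compact check of the formulas above (a sketch for illustration, not part of the original filing; the 9.81 m/s² readings are the example values quoted earlier), the three angles can be computed directly from the accelerometer values:

```python
import math

def tilt_angles_deg(ax: float, ay: float, az: float):
    """Angles, in degrees, between the X, Y, Z axes and the horizontal plane,
    following theta = arctan(A_i / sqrt(A_j^2 + A_k^2)) * 180/pi.
    atan2 replaces atan(a/b) only so that a zero denominator (two axes reading 0)
    does not raise; the value is identical whenever the denominator is nonzero."""
    theta_x = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    theta_y = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    theta_z = math.degrees(math.atan2(az, math.sqrt(ax * ax + ay * ay)))
    return theta_x, theta_y, theta_z

# Phone lying flat, screen up (Ax = 0, Ay = 0, Az = 9.81): the Y axis lies in the
# horizontal plane, so theta_y = 0 and only the Z axis is tilted away from it.
print(tilt_angles_deg(0.0, 0.0, 9.81))   # (0.0, 0.0, 90.0)

# Phone held upright in portrait (Ax = 0, Ay = 9.81, Az = 0): theta_y = 90 degrees,
# the value reported to the playback device as the shooting angle information.
print(tilt_angles_deg(0.0, 9.81, 0.0))   # (0.0, 90.0, 0.0)
```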
Either of the two ways of obtaining the shooting angle information, via the direction sensor or via the acceleration sensor, can be selected, or both can be adopted; the information obtained is used as the shooting angle information and sent to the playback device.
Step 202: the video recording terminal sends the shot video and the shooting angle information to the playback device.
The sending may use any of various existing data transmission methods, and the details are not described here.
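The message format itself is likewise left open. Purely as an assumption for illustration (the host, port and JSON field are invented for this sketch, not taken from the application), one simple option is to prefix each clip with a small length-delimited JSON header carrying the angle:

```python
import json
import socket

def send_clip_with_angle(host: str, port: int, video_bytes: bytes, angle_deg: float) -> None:
    """Illustrative transport only: send a JSON header with the shooting angle,
    then the encoded video bytes, over a plain TCP connection."""
    header = json.dumps({"shooting_angle_deg": angle_deg}).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(len(header).to_bytes(4, "big"))       # header length
        conn.sendall(header)                               # shooting angle information
        conn.sendall(len(video_bytes).to_bytes(8, "big"))  # payload length
        conn.sendall(video_bytes)                          # the recorded video

# Hypothetical usage:
# send_clip_with_angle("playback.example", 9000, open("clip.mp4", "rb").read(), 90.0)
```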
Step 203: the playback device receives the shooting angle information and determines the shooting angle.
In this step, the playback device receives the shooting angle information and uses it to determine the specific shooting angle.
Step 204: the playback device adjusts its placement angle according to the shooting angle and plays the video.
The player in this application has an automatic screen rotation function (e.g. ±90°), which can be realized in various conventional ways. Preferably, the screen is connected to a mechanical control device: the playback device outputs the shooting angle to an angle adjustment unit, and the mechanical control device rotates the playback device until it matches the shooting angle, ensuring that the video is played upright, i.e. top and bottom the right way round, never upside down.
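A minimal sketch of this playback-side step (illustrative only: rotate_mount_to and play_video stand in for whatever interface the angle adjustment unit and player actually expose, which the application does not define):

```python
def normalize_angle(angle_deg: float) -> float:
    """Map a reported shooting angle into (-180, 180] degrees."""
    angle = angle_deg % 360.0
    return angle - 360.0 if angle > 180.0 else angle

def adjust_and_play(shooting_angle_deg: float, rotate_mount_to, play_video) -> None:
    """Assumed step 204: rotate the screen mount to match the terminal's shooting
    angle, folding upside-down targets back by 180 degrees so playback stays upright,
    then start the video."""
    target = normalize_angle(shooting_angle_deg)
    if abs(target) > 90.0:                       # would be upside down
        target = normalize_angle(target + 180.0)
    rotate_mount_to(target)                      # command to the mechanical control device
    play_video()                                 # play at the matching placement angle

# Example with stand-in callables:
adjust_and_play(90.0,
                rotate_mount_to=lambda a: print(f"rotate mount to {a} degrees"),
                play_video=lambda: print("playing video"))
```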
This concludes the method flow of the present application.
In the method for playing terminal-returned video provided by this application, the mobile terminal calculates the shooting angle and notifies the playback device, so the playback device can sense the recording angle of the video recording terminal; through a combination of software and hardware, the playback device is controlled to rotate automatically to the same angle at which the video was recorded and then plays the video. Therefore, no matter how the angle of the video recording terminal changes while recording, the playback device automatically adjusts its own angle for playback, and the transmitted information is lost as little as possible while preserving the viewing experience.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (4)

1. A method for playing video returned by a terminal, characterized by comprising the following steps:
the playback device receives the video sent by the terminal and the shooting angle information of the video;
the playback device determines the shooting angle from the shooting angle information, adjusts its placement angle according to the shooting angle, and plays the video.
2. The method according to claim 1, wherein the shooting angle information of the video is obtained as follows:
the terminal determines real-time direction information acquired by a direction sensor while shooting the video, and calculates the shooting angle information from the real-time direction information;
and/or the terminal determines real-time acceleration information acquired by a gravity acceleration sensor while shooting the video, determines the angle between the terminal's screen and the horizontal plane from the real-time acceleration information, and uses that angle as the shooting angle information.
3. The method according to claim 1, wherein adjusting the placement angle of the playback device according to the shooting angle comprises:
the playback device outputs the shooting angle to an angle adjustment unit, and a mechanical control device connected to the playback device rotates the playback device until it matches the shooting angle, ensuring that the video is played upright.
4. The method according to claim 2, wherein determining the angle between the screen of the terminal and the horizontal plane from the real-time acceleration information comprises:
calculating the angle θy between the Y axis and the horizontal plane as θy = [arctan(Ay/sqrt(Ax*Ax + Az*Az))] × 180/π from the acceleration Ax along the X axis, the acceleration Ay along the Y axis, and the acceleration Az along the Z axis of the terminal screen;
where the X and Y axes are parallel to the screen surface of the terminal, the Z axis is perpendicular to the screen surface, and sqrt() denotes the square root operation.
CN202010145316.5A 2020-03-05 2020-03-05 Playing method and device of terminal return video Pending CN113364967A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010145316.5A CN113364967A (en) 2020-03-05 2020-03-05 Playing method and device of terminal return video


Publications (1)

Publication Number Publication Date
CN113364967A (en) 2021-09-07

Family

ID=77523445

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010145316.5A Pending CN113364967A (en) 2020-03-05 2020-03-05 Playing method and device of terminal return video

Country Status (1)

Country Link
CN (1) CN113364967A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6798443B1 (en) * 1995-05-30 2004-09-28 Francis J. Maguire, Jr. Apparatus for inducing attitudinal head movements for passive virtual reality
CN102353351A (en) * 2011-06-28 2012-02-15 惠州Tcl移动通信有限公司 Lateral shooting angle detecting method, inclined shooting angle detecting method and mobile phone
US20140153897A1 (en) * 2012-12-03 2014-06-05 Canon Kabushiki Kaisha Display apparatus and control method thereof
CN104238640A (en) * 2013-06-18 2014-12-24 原建桥 Desktop display in changeable display mode
WO2016088493A1 (en) * 2014-12-03 2016-06-09 株式会社タカラトミー Motion-controlled video game device
CN108574806A (en) * 2017-03-09 2018-09-25 腾讯科技(深圳)有限公司 Video broadcasting method and device


Similar Documents

Publication Publication Date Title
US9918118B2 (en) Apparatus and method for playback of audio-visual recordings
CN103336532B (en) The support device of 3D visual-aural system and 3D visual-aural system
US10021339B2 (en) Electronic device for generating video data
KR101778420B1 (en) System and method for adjusting orientation of captured video
US9215383B2 (en) System for enhancing video from a mobile camera
WO2018064831A1 (en) Tripod head, unmanned aerial vehicle and control method therefor
US10939068B2 (en) Image capturing device, image capturing system, image processing method, and recording medium
US20040006424A1 (en) Control system for tracking and targeting multiple autonomous objects
CN108476293B (en) Multifunctional camera, control method thereof, wearable device, cradle head and aircraft
CN112335264B (en) Apparatus and method for presenting audio signals for playback to a user
US11082607B2 (en) Systems and methods for generating composite depth images based on signals from an inertial sensor
US20220067974A1 (en) Cloud-Based Camera Calibration
US20210097696A1 (en) Motion estimation methods and mobile devices
CN115225815B (en) Intelligent target tracking shooting method, server, shooting system, equipment and medium
US20200364519A1 (en) Systems and methods for generating composite sets of data from different sensors
CN113364967A (en) Playing method and device of terminal return video
US20230009190A1 (en) Flight-capable rail-based system
KR20080006925A (en) The method and system to get the frame data of moving shot with camera on a vehicle and the location data from location base service or gps and the direction data of the vehicle to send to server through wireless internet by real time and to be used that by another vehicle
CN203204452U (en) 3D visual and auditory system supporting device and 3D visual and auditory system
WO2018010472A1 (en) Smart display device for controlling rotation of tripod head of unmanned aerial vehicle, and control system thereof
CN114143460A (en) Video display method and device and electronic equipment
WO2018010473A1 (en) Unmanned aerial vehicle cradle head rotation control method based on smart display device
CN111147840A (en) Automatic control and communication system for video and audio acquisition of 3D camera rocker arm
TWI373630B (en)
CN205356558U (en) System is watched in remote control and make a video recording and hold and watch end thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210907