CN107707817B - Video shooting method and mobile terminal
- Publication number: CN107707817B
- Application number: CN201710889283.3A
- Authority: CN (China)
- Prior art keywords: mobile terminal, posture, image information, axis, shooting
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention provides a video shooting method and a mobile terminal. The video shooting method comprises the following steps: generating a control instruction when the initial posture of the mobile terminal meets a preset posture condition; shooting according to the control instruction, and acquiring each posture of the mobile terminal in the shooting process and the image information shot in each posture; and generating a video file according to the image information. With this technical scheme, the posture changes of the mobile terminal are recorded during shooting and a video file displaying a complete three-dimensional picture of the target object is generated, so that the target object can be displayed flexibly and conveniently and a realistic sensory effect is simulated.
Description
Technical Field
The invention relates to the technical field of communication, in particular to a video shooting method and a mobile terminal.
Background
At present, when a user wants to display a target object in three dimensions, the user usually has to shoot the object from different angles with a terminal camera, obtain multiple pictures and then stitch them together with software, or shoot with a three-dimensional camera, so as to generate a picture or video file corresponding to the object. If the result is a set of pictures, the user has to slide back and forth to switch pictures in order to view each face of the object, so the target object cannot be displayed in three dimensions conveniently. If the result is a video, the user has to keep fast-forwarding, rewinding and even replaying while watching, and it is difficult to display the complete object flexibly from a video shot by the terminal.
Disclosure of the Invention
The embodiments of the present invention provide a video shooting method and a mobile terminal, aiming to solve the problem in the prior art that the pictures or video files generated by a terminal can hardly display a complete object three-dimensionally in a convenient and flexible way.
In a first aspect, an embodiment of the present invention provides a video shooting method, including:
generating a control instruction when the initial posture of the mobile terminal meets a preset posture condition;
Shooting according to the control instruction, and acquiring each posture of the mobile terminal in the shooting process and image information shot under each posture;
And generating a video file according to the image information.
In a second aspect, an embodiment of the present invention provides a mobile terminal, including:
The first generating module is used for generating a control instruction when the initial posture of the mobile terminal meets a preset posture condition;
The acquisition module is used for shooting according to the control instruction and acquiring each posture of the mobile terminal in the shooting process and the image information shot in each posture;
And the second generation module is used for generating a video file according to the image information.
In a third aspect, an embodiment of the present invention provides a mobile terminal, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the above video shooting method when executing the computer program.
In a fourth aspect, the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps in the above-mentioned video shooting method.
According to the technical scheme, a control instruction is generated when the initial posture of the mobile terminal meets the preset posture condition, shooting is carried out according to the control instruction, the posture information and the shot image information during shooting are acquired, and a video file displaying a complete three-dimensional picture of the target object is generated from the image information, so that the target object can be displayed flexibly and conveniently and a realistic sensory effect can be simulated.
Drawings
FIG. 1 is a schematic diagram of a video shooting method according to an embodiment of the invention;
FIG. 2 is a schematic diagram of a three-dimensional rectangular coordinate system according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a stereo capture of a target object according to an embodiment of the present invention;
FIG. 4a is a schematic diagram illustrating the mobile terminal rotating counterclockwise along the Y-axis according to an embodiment of the present invention;
FIG. 4b is a schematic diagram illustrating the mobile terminal rotating clockwise along the Y-axis according to the embodiment of the present invention;
FIG. 4c is a schematic diagram illustrating the mobile terminal rotating counterclockwise along the X-axis according to the embodiment of the present invention;
FIG. 4d is a schematic diagram illustrating the mobile terminal rotating clockwise along the X-axis according to the embodiment of the present invention;
FIG. 5 is a diagram of a mobile terminal according to an embodiment of the invention;
Fig. 6 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present invention.
Detailed Description
the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
An embodiment of the present invention provides a video shooting method, as shown in FIG. 1, including:
Step 101, generating a control instruction when the initial posture of the mobile terminal meets a preset posture condition.
Before shooting a target object, the mobile terminal needs to detect its posture; when the initial posture of the mobile terminal meets the preset posture condition, a control instruction can be generated, and the shooting process is carried out according to the control instruction.
The step of generating the control instruction when the initial posture of the mobile terminal meets the preset posture condition includes: acquiring the gravity acceleration component corresponding to each coordinate axis of the mobile terminal through a gravity sensor arranged in the mobile terminal; and detecting whether the gravity acceleration component on each coordinate axis of the mobile terminal is within the corresponding preset range, and if so, generating the control instruction. The mobile terminal corresponds to a three-dimensional rectangular coordinate system, which includes an X axis, a Y axis and a Z axis that are perpendicular to one another.
A gravity sensor is arranged in the mobile terminal and is used to acquire the gravity acceleration component on each coordinate axis corresponding to the mobile terminal. The X axis and the Y axis of the three-dimensional rectangular coordinate system corresponding to the mobile terminal lie in the plane of the display screen of the mobile terminal, the X axis is perpendicular to the Y axis, the X axis is parallel to a first direction of the mobile terminal, the first direction being the length direction or the width direction of the mobile terminal, and the Z axis is perpendicular to the plane of the display screen, i.e. perpendicular to both the X axis and the Y axis. As shown in FIG. 2, the embodiment of the present invention is described by taking the case in which the X axis is parallel to the width direction of the mobile terminal as an example.
The gravity sensor arranged in the mobile terminal acquires the gravity acceleration components corresponding to the X, Y and Z axes. After these components are acquired, each of them is checked against its corresponding preset range. If the gravity acceleration components on the X, Y and Z axes are all within their corresponding preset ranges, it is determined that the initial posture of the mobile terminal meets the preset condition, and the mobile terminal then generates a control instruction to carry out the shooting process.
In the embodiment of the invention, when the gravity acceleration component Xg corresponding to the X coordinate axis of the mobile terminal is in the range 3.0 > Xg > -3.0, the gravity acceleration component Yg corresponding to the Y coordinate axis is in the range 10.0 > Yg > 8.0, and the gravity acceleration component Zg corresponding to the Z coordinate axis is in the range 3.0 > Zg > -3.0, it is determined that the initial posture of the mobile terminal meets the preset condition; at this moment the mobile terminal generates a control instruction, and step 102 is executed according to the control instruction.
It should be noted that the ranges of the gravity acceleration components on the X, Y and Z axes given in the embodiment of the present invention are only one example of the initial posture meeting the preset posture condition; those skilled in the art may also determine that the initial posture of the mobile terminal meets the preset condition when the gravity acceleration components on the X, Y and Z axes are in other ranges. In the embodiment of the invention, a switch is arranged on the shooting interface of the mobile terminal and is used to control the turning on of the gravity sensor. When the posture of the mobile terminal needs to be acquired, the switch can be turned on, and the current posture information of the mobile terminal is detected through the gravity sensor.
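For illustration, the posture check described above can be sketched as follows. This is only an illustrative sketch under the assumption that the gravity components are already available (for example from a gravity sensor such as Android's Sensor.TYPE_GRAVITY); the type and function names are not taken from the patent.

```kotlin
// Illustrative sketch only: gravity components (m/s^2) as read from a gravity sensor.
data class GravityReading(val xg: Float, val yg: Float, val zg: Float)

// Preset ranges quoted in this embodiment (other ranges are equally possible).
fun meetsPresetPosture(g: GravityReading): Boolean =
    g.xg > -3.0f && g.xg < 3.0f &&   // X component near zero
    g.yg > 8.0f && g.yg < 10.0f &&   // Y component close to gravity (terminal held roughly upright)
    g.zg > -3.0f && g.zg < 3.0f      // Z component near zero

// The "control instruction" is modelled here as a simple flag that would start step 102.
fun shouldGenerateControlInstruction(g: GravityReading): Boolean = meetsPresetPosture(g)
```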
Step 102, shooting according to the control instruction, and acquiring each posture of the mobile terminal in the shooting process and the image information shot in each posture.
After the mobile terminal generates the control instruction, the shooting process is executed according to the control instruction, in which the mobile terminal acquires stereoscopic picture information of the target object. FIG. 3 is a schematic diagram illustrating stereoscopic shooting of a target object according to an embodiment of the present invention. During shooting, the camera needs to rotate horizontally around the target object and also shoot it from above and below, i.e. shoot it from all angles, in order to acquire the required image information. At the same time, each corresponding posture of the mobile terminal in the shooting process is acquired, so as to form a correspondence between the different postures and the image information.
In the embodiment of the present invention, the step of performing shooting according to the control instruction, and acquiring each posture of the mobile terminal in the shooting process and image information shot in each posture includes:
according to the control instruction, performing stereoscopic shooting of the target object through a camera of the mobile terminal in a preset sequence; and acquiring each posture of the mobile terminal in the shooting process and the image information shot in each posture; wherein the posture of the mobile terminal includes: the gravity acceleration component information on each coordinate axis of the mobile terminal and the rotation angle information of the mobile terminal relative to the initial posture.
It should be noted that a gyroscope sensor is also arranged in the mobile terminal of the embodiment of the present invention. When the user shoots while rotating around the target object, the angular velocity can be obtained through the gyroscope sensor, and the rotation angle of the mobile terminal relative to the initial posture is calculated by integrating the angular velocity over time. Furthermore, the gyroscope sensor and the gravity sensor can correspond to the same switch, so that the switch arranged on the shooting interface of the mobile terminal can simultaneously control the turning on of the gravity sensor and the gyroscope sensor.
The gesture of the mobile terminal includes: the method comprises the steps of obtaining the component information of the gravity acceleration corresponding to each coordinate axis of the mobile terminal and the rotation angle information of the mobile terminal relative to the initial posture. The component values of the gravity acceleration corresponding to each coordinate axis of the mobile terminal can be obtained through the gravity sensor, and the rotation angle of the mobile terminal relative to the initial posture can be obtained through the gyroscope sensor.
After the mobile terminal generates the control instruction, the shooting process can be carried out according to the control instruction, and the target object can be shot stereoscopically in a preset sequence. The preset sequence may be: first shooting while rotating horizontally, then shooting the upper part of the target object, and finally shooting the lower part of the target object; or first shooting the upper part of the target object, then shooting horizontally, and finally shooting the lower part of the target object; or first completing one half of the horizontal shooting, then shooting the upper and lower parts of the target object, and finally completing the remaining half of the horizontal shooting. Of course, other preset sequences are also possible, and the sequence is not specifically limited here.
In the process of stereoscopic shooting, the mobile terminal has to change its shooting angle continuously, so during shooting all postures of the mobile terminal need to be recorded in real time, and the image information corresponding to each posture is recorded at the same time, so as to form the correspondence between postures and image information. The posture of the mobile terminal includes the gravity acceleration components on the coordinate axes at each position and the rotation angle of the mobile terminal relative to the initial posture.
The shooting sequence of first shooting horizontally and then shooting the upper and lower parts of the target object is described in detail below. Specifically: when the gravity acceleration component Xg corresponding to the X coordinate axis of the mobile terminal is between -3.0 and 3.0, the gravity acceleration component Yg corresponding to the Y coordinate axis is between 8.0 and 10.0, and the gravity acceleration component Zg corresponding to the Z coordinate axis is between -3.0 and 3.0, it is determined that the initial posture of the mobile terminal meets the preset condition, and the shooting mode of the mobile terminal is started. First, the mobile terminal rotates horizontally around the target object while shooting, and during shooting the gravity acceleration components Xg, Yg and Zg on each coordinate axis at each position and the rotation angle θ relative to the initial posture are recorded. When θ is greater than or equal to 360 degrees, it is determined that the horizontal rotation shooting is completed. The mobile terminal is then adjusted to shoot the target object from above and below. When the gravity acceleration component Zg corresponding to the Z coordinate axis of the mobile terminal is between 8.0 and 10.0 and the angle α by which the mobile terminal has rotated upward relative to the initial posture is greater than 80 degrees, it is determined that the shooting above the target object is finished. When the gravity acceleration component Zg corresponding to the Z coordinate axis of the mobile terminal is less than -7.0 and the angle β by which the mobile terminal has rotated downward relative to the initial posture is greater than 80 degrees, it is determined that the shooting below the target object is finished.
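To make the sequence above easier to follow, the three completion checks (horizontal sweep, shooting above, shooting below) can be sketched as below, using the threshold values quoted in this embodiment; the data types and function names are illustrative assumptions.

```kotlin
// Illustrative posture snapshot recorded for every captured frame.
data class Posture(
    val xg: Float, val yg: Float, val zg: Float,  // gravity components per coordinate axis
    val horizontalAngleDeg: Double,               // θ: horizontal rotation around the target
    val verticalAngleDeg: Double                  // positive = rotated upward (α), negative = downward (β)
)

fun horizontalPassFinished(p: Posture): Boolean =
    p.horizontalAngleDeg >= 360.0                 // θ ≥ 360°: one full horizontal circle shot

fun topPassFinished(p: Posture): Boolean =
    p.zg > 8.0f && p.zg < 10.0f &&                // Zg between 8.0 and 10.0
    p.verticalAngleDeg > 80.0                     // upward rotation α greater than 80°

fun bottomPassFinished(p: Posture): Boolean =
    p.zg < -7.0f &&                               // Zg less than -7.0
    p.verticalAngleDeg < -80.0                    // downward rotation β greater than 80°
```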
Step 103, generating a video file according to the image information.
When shooting is finished, the acquired image information is synthesized in shooting time order to generate the video file. That is, when the mobile terminal completes the stereoscopic shooting of the target object in the preset sequence, the captured image information is synthesized in shooting time order to obtain a video file displaying a complete three-dimensional picture of the target object, so that the target object can be displayed flexibly, conveniently and smoothly, and a realistic sensory effect can be simulated.
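The synthesis step can be sketched as simply ordering the captured frames by shooting time and feeding them to an encoder. The encoder interface below is purely hypothetical (a real Android implementation would typically use a MediaCodec-based pipeline); the frame record reuses the illustrative Posture type from the sketch above.

```kotlin
// Hypothetical frame record: the image data captured for one posture, tagged with capture time.
data class CapturedFrame(
    val captureTimeNs: Long,      // when the frame was shot
    val posture: Posture,         // posture recorded for this frame
    val imageBytes: ByteArray     // encoded image data for this frame
)

// Hypothetical encoder abstraction, not an API from the patent or from a specific library.
interface VideoEncoder {
    fun addFrame(imageBytes: ByteArray)
    fun finish(outputPath: String)
}

// Frames are composited strictly in shooting-time order, as described above.
fun synthesizeVideo(frames: List<CapturedFrame>, encoder: VideoEncoder, outputPath: String) {
    frames.sortedBy { it.captureTimeNs }.forEach { encoder.addFrame(it.imageBytes) }
    encoder.finish(outputPath)
}
```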
In the embodiment of the present invention, after generating a video file according to image information, the method further includes:
When the video file is played and the rotation of the mobile terminal is detected, the playing of the video file is paused, and the current image information is obtained; when the rotation angle of the mobile terminal is larger than a preset angle, generating a trigger instruction corresponding to the rotation angle; and acquiring target image information according to the trigger instruction and the current image information, and playing the target image information.
According to the invention, after the stereoscopic shooting of the target object is completed and the video file is generated, the generated video file can be played. In the process of playing the video file, when rotation of the mobile terminal is detected, the playing of the currently played video file is paused, and after the pause the current image information is acquired. It is then detected whether the rotation angle of the mobile terminal relative to the initial posture is larger than the preset angle; when the rotation angle of the mobile terminal is larger than the preset angle, a trigger instruction is generated, a jump of the image information is triggered according to the trigger instruction, and the corresponding target image information is acquired and played.
In the embodiment of the present invention, the preset angle is 5 degrees, but it is not limited to this value, and those skilled in the art can set the preset angle according to actual requirements. The embodiment of the present invention is described by taking a preset angle of 5 degrees as an example. When the rotation angle of the mobile terminal is determined to be greater than 5 degrees, a trigger instruction corresponding to the rotation angle is generated.
In the embodiment of the present invention, when the rotation angle of the mobile terminal is larger than the preset angle, the step of generating the trigger instruction corresponding to the rotation angle includes: when the rotation angle of the mobile terminal is larger than the preset angle, acquiring the rotation direction of the rotation angle; and generating a corresponding trigger instruction according to the rotation direction.
When the mobile terminal rotates, the possible rotation directions include a counterclockwise rotation direction along the Y axis of the mobile terminal, a clockwise rotation direction along the Y axis, a counterclockwise rotation direction along the X axis, and a clockwise rotation direction along the X axis. The mobile terminal corresponds to a three-dimensional rectangular coordinate system comprising an X axis, a Y axis and a Z axis that are perpendicular to one another, where the X axis is parallel to a first direction of the mobile terminal, the Y axis is perpendicular to the first direction, and the first direction is the width or length direction of the mobile terminal. In the embodiment of the present invention, the X axis is taken as the width direction of the mobile terminal and the Y axis as the length direction for illustration.
When it is determined that the rotation angle of the mobile terminal is larger than the preset angle, the rotation direction of the rotation needs to be detected, and after the corresponding rotation direction is obtained, a corresponding trigger instruction is generated according to the rotation direction. That is, the trigger instruction carries the corresponding rotation direction information.
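A minimal sketch of this trigger generation is given below, using the 5-degree preset angle of this embodiment; the sign convention (positive angles meaning counterclockwise rotation) and all names are assumptions, not the patented implementation.

```kotlin
// Illustrative rotation directions used by the playback control described above.
enum class RotationDirection { CCW_ALONG_Y, CW_ALONG_Y, CCW_ALONG_X, CW_ALONG_X }

data class TriggerInstruction(val direction: RotationDirection)

const val PRESET_ANGLE_DEG = 5.0   // example threshold quoted in this embodiment

// angleAlongXDeg / angleAlongYDeg: rotation of the terminal since playback was paused.
fun maybeGenerateTrigger(angleAlongXDeg: Double, angleAlongYDeg: Double): TriggerInstruction? = when {
    angleAlongYDeg > PRESET_ANGLE_DEG  -> TriggerInstruction(RotationDirection.CCW_ALONG_Y)
    angleAlongYDeg < -PRESET_ANGLE_DEG -> TriggerInstruction(RotationDirection.CW_ALONG_Y)
    angleAlongXDeg > PRESET_ANGLE_DEG  -> TriggerInstruction(RotationDirection.CCW_ALONG_X)
    angleAlongXDeg < -PRESET_ANGLE_DEG -> TriggerInstruction(RotationDirection.CW_ALONG_X)
    else -> null   // rotation below the preset angle: keep the video paused, no jump
}
```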
After the trigger instruction and the current image information are obtained, the step of obtaining the target image information according to the trigger instruction and the current image information and playing the target image information includes: acquiring the current posture of the mobile terminal corresponding to the current image information; determining a target posture of the mobile terminal according to the current posture and the rotation direction; and acquiring the target image information corresponding to the target posture, and playing and displaying the target image information.
After the trigger instruction is acquired, the current posture corresponding to the current image information is obtained according to the pre-formed correspondence between the postures of the mobile terminal and the image information, where the current posture includes the rotation angle relative to the initial posture of the mobile terminal and the gravity acceleration component value on each coordinate axis. After the current posture is obtained, the target posture is determined according to the current posture and the rotation direction corresponding to the trigger instruction; after the target posture is determined, the target image information corresponding to the target posture is obtained according to the correspondence between the postures of the mobile terminal and the image information, and the obtained target image information is then played and displayed on the display interface.
In the embodiment of the present invention, the step of determining the target posture of the mobile terminal according to the current posture and the rotation direction includes: rotating by the target angle in the rotation direction on the basis of the current posture to determine the target posture.
After the current posture corresponding to the current image information of the mobile terminal is determined, the target posture of the mobile terminal is determined by rotating by the target angle, in the rotation direction indicated by the trigger instruction, on the basis of the current posture. The target angle is a preset angle; the embodiment of the present invention is described by taking a target angle of 30 degrees as an example, but a person skilled in the art may set the target angle to another value according to actual requirements.
Several specific rotation direction implementations of the mobile terminal are described below by way of example.
As shown in FIG. 4a, when the mobile terminal rotates counterclockwise along the Y axis by an angle greater than 5 degrees, a trigger instruction is generated. After the trigger instruction is obtained, the current posture corresponding to the current image information is determined, and the target posture is determined by rotating 30 degrees to the right on the basis of the current posture, so that the corresponding target image information is obtained; the target image information is the recorded picture 30 degrees to the right of the current image information in the horizontal direction, to which playback fast-forwards.
As shown in FIG. 4b, when the mobile terminal rotates clockwise along the Y axis by an angle greater than 5 degrees, a trigger instruction is generated. After the trigger instruction is obtained, the current posture corresponding to the current image information is determined, and the target posture is determined by rotating 30 degrees to the left on the basis of the current posture, so that the corresponding target image information is obtained; the target image information is the recorded picture 30 degrees to the left of the current image information in the horizontal direction, to which playback fast-forwards.
As shown in FIG. 4c, when the mobile terminal rotates counterclockwise along the X axis by an angle greater than 5 degrees, a trigger instruction is generated. After the trigger instruction is obtained, the current posture corresponding to the current image information is determined, and the target posture is determined by rotating 30 degrees upward on the basis of the current posture, so that the corresponding target image information is obtained; the target image information is the recorded picture 30 degrees above the current image information in the vertical direction, to which playback fast-forwards.
As shown in FIG. 4d, when the mobile terminal rotates clockwise along the X axis by an angle greater than 5 degrees, a trigger instruction is generated. After the trigger instruction is obtained, the current posture corresponding to the current image information is determined, and the target posture is determined by rotating 30 degrees downward on the basis of the current posture, so that the corresponding target image information is obtained; the target image information is the recorded picture 30 degrees below the current image information in the vertical direction, to which playback fast-forwards.
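The four cases of FIGS. 4a to 4d can be summarized in the following sketch: each rotation direction maps to a 30-degree offset (the example target angle of this embodiment) applied to the current posture, and the recorded frame whose posture is closest to the target posture is looked up for playback. The nearest-neighbour lookup is an assumption; the patent only requires that a posture-to-image correspondence exists. The types reuse the illustrative sketches above.

```kotlin
const val TARGET_ANGLE_DEG = 30.0   // example target angle of this embodiment

fun targetPosture(current: Posture, direction: RotationDirection): Posture = when (direction) {
    RotationDirection.CCW_ALONG_Y -> current.copy(horizontalAngleDeg = current.horizontalAngleDeg + TARGET_ANGLE_DEG) // FIG. 4a: 30° to the right
    RotationDirection.CW_ALONG_Y  -> current.copy(horizontalAngleDeg = current.horizontalAngleDeg - TARGET_ANGLE_DEG) // FIG. 4b: 30° to the left
    RotationDirection.CCW_ALONG_X -> current.copy(verticalAngleDeg = current.verticalAngleDeg + TARGET_ANGLE_DEG)     // FIG. 4c: 30° upward
    RotationDirection.CW_ALONG_X  -> current.copy(verticalAngleDeg = current.verticalAngleDeg - TARGET_ANGLE_DEG)     // FIG. 4d: 30° downward
}

// Find the recorded frame whose posture is closest to the target posture.
fun findTargetFrame(frames: List<CapturedFrame>, target: Posture): CapturedFrame? =
    frames.minByOrNull {
        val dh = it.posture.horizontalAngleDeg - target.horizontalAngleDeg
        val dv = it.posture.verticalAngleDeg - target.verticalAngleDeg
        dh * dh + dv * dv   // squared angular distance to the target posture
    }
```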
According to the technical scheme, a control instruction is generated when the initial posture of the mobile terminal meets the preset posture condition, shooting is carried out according to the control instruction, the posture information and the shot image information during shooting are acquired, and a video file displaying a complete three-dimensional picture of the target object is generated from the image information, so that the target object can be displayed flexibly and conveniently and a realistic sensory effect can be simulated.
Furthermore, while watching the video, the user does not need to operate a progress bar manually to control the playing progress; the mobile terminal only needs to be rotated by a certain angle to generate a trigger instruction, and the mobile terminal jumps to the corresponding picture for playback according to the trigger instruction. When the user rotates the mobile terminal in different directions, playback jumps to different pictures, which improves convenience for the user. Meanwhile, when the video is played, target objects at different angles are displayed by combining the posture change of the mobile terminal with the posture information in the video file, so that a realistic picture experience can be simulated and the visual experience of the user is improved.
An embodiment of the present invention further provides a mobile terminal, as shown in FIG. 5, including:
The first generating module 10 is configured to generate a control instruction when the initial posture of the mobile terminal meets a preset posture condition;
The acquisition module 20 is configured to perform shooting according to the control instruction, and acquire each posture of the mobile terminal in the shooting process and image information shot in each posture;
and a second generating module 30, configured to generate a video file according to the image information.
Wherein the first generating module 10 includes:
The first obtaining submodule 11 is configured to obtain a gravity acceleration component corresponding to each coordinate axis of the mobile terminal through a gravity sensor arranged in the mobile terminal;
The processing submodule 12 is configured to detect whether a gravity acceleration component on each coordinate axis of the mobile terminal is within a corresponding preset range, and if so, the mobile terminal generates a control instruction;
The mobile terminal corresponds to a three-dimensional rectangular coordinate system, and the three-dimensional rectangular coordinate system comprises an X axis, a Y axis and a Z axis which are perpendicular to each other.
Wherein the acquisition module 20 includes:
The shooting submodule 21 is used for performing stereo shooting on the target object according to a preset sequence through a camera of the mobile terminal according to the control instruction;
The second obtaining submodule 22 is configured to obtain each posture of the mobile terminal in the shooting process and the image information shot in each posture;
The posture of the mobile terminal includes: the gravity acceleration component information on each coordinate axis of the mobile terminal and the rotation angle information of the mobile terminal relative to the initial posture.
Wherein the second generating module 30 is further configured to:
And when shooting is finished, synthesizing the acquired image information according to the shooting time sequence to generate a video file.
Wherein the mobile terminal further includes:
The processing module 40 is configured to, after the second generating module 30 generates a video file according to the image information, pause playing the video file when the video file is played and the mobile terminal is detected to be rotated, and acquire current image information;
A third generating module 50, configured to generate a trigger instruction corresponding to the rotation angle of the mobile terminal when the rotation angle is greater than the preset angle;
And the playing module 60 is configured to obtain the target image information according to the trigger instruction and the current image information, and play the target image information.
Wherein the third generating module 50 includes:
The third obtaining submodule 51 is configured to obtain the rotation direction of the rotation angle when the rotation angle of the mobile terminal is greater than the preset angle;
The generating submodule 52 is configured to generate a corresponding trigger instruction according to the rotation direction;
Wherein the direction of rotation includes: a direction of counterclockwise rotation along a Y-axis of the mobile terminal, a direction of clockwise rotation along a Y-axis of the mobile terminal, a direction of counterclockwise rotation along an X-axis of the mobile terminal, and a direction of clockwise rotation along an X-axis of the mobile terminal;
The mobile terminal corresponds to a three-dimensional rectangular coordinate system, the three-dimensional rectangular coordinate system comprises an X axis, a Y axis and a Z axis which are perpendicular to each other, the X axis is parallel to a first direction of the mobile terminal, the Y axis is perpendicular to the first direction of the mobile terminal, and the first direction is the width or length direction of the mobile terminal.
The playing module 60 includes:
The fourth obtaining submodule 61 is configured to obtain a current posture of the mobile terminal corresponding to the current image information;
A determining submodule 62, configured to determine a target posture of the mobile terminal according to the current posture and the rotation direction;
And the playing submodule 63 is configured to acquire target image information corresponding to the target posture, and play and display the target image information.
Wherein the determination submodule 62 is further configured to:
And on the basis of the current posture, rotating the target angle according to the rotating direction to determine the target posture.
The mobile terminal provided by the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiment of FIG. 1, which is not described here again in order to avoid repetition. In this way, a control instruction is generated when the initial posture of the mobile terminal meets the preset posture condition, shooting is carried out according to the control instruction, the posture information and the shot image information during shooting are acquired, and a video file displaying a complete three-dimensional picture of the target object is generated from the image information, so that the target object can be displayed flexibly and conveniently and a realistic sensory effect is simulated.
Furthermore, while watching the video, the user does not need to operate a progress bar manually to control the playing progress; the mobile terminal only needs to be rotated by a certain angle to generate a trigger instruction, and the mobile terminal jumps to the corresponding picture for playback according to the trigger instruction. When the user rotates the mobile terminal in different directions, playback jumps to different pictures, which improves convenience for the user. Meanwhile, when the video is played, target objects at different angles are displayed by combining the posture change of the mobile terminal with the posture information in the video file, so that a realistic picture experience can be simulated and the visual experience of the user is improved.
Fig. 6 is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, where the mobile terminal 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and a power supply 611. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 6 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 610 is configured to: generate a control instruction when the initial posture of the mobile terminal meets a preset posture condition; perform shooting according to the control instruction, and acquire each posture of the mobile terminal in the shooting process and the image information shot in each posture; and generate a video file according to the image information.
Optionally, when the initial posture of the mobile terminal meets the preset posture condition and generates the control instruction, the processor 610 is further configured to: acquiring a gravity acceleration component corresponding to each coordinate axis of the mobile terminal through a gravity sensor arranged in the mobile terminal; detecting whether the gravity acceleration component on each coordinate axis of the mobile terminal is in a corresponding preset range, if so, generating a control instruction by the mobile terminal; the mobile terminal corresponds to a three-dimensional rectangular coordinate system, and the three-dimensional rectangular coordinate system comprises an X axis, a Y axis and a Z axis which are perpendicular to each other.
Optionally, when performing shooting according to the control instruction and acquiring each posture of the mobile terminal in the shooting process and the image information shot in each posture, the processor 610 is further configured to: perform, according to the control instruction, stereoscopic shooting of the target object through a camera of the mobile terminal in a preset sequence; and acquire each posture of the mobile terminal in the shooting process and the image information shot in each posture; wherein the posture of the mobile terminal includes the gravity acceleration component information on each coordinate axis of the mobile terminal and the rotation angle information of the mobile terminal relative to the initial posture.
Optionally, when generating the video file according to the image information, the processor 610 is further configured to: and when shooting is finished, synthesizing the acquired image information according to the shooting time sequence to generate a video file.
Optionally, after generating the video file according to the image information, the processor 610 is further configured to: when the video file is played and the rotation of the mobile terminal is detected, the playing of the video file is paused, and the current image information is obtained; when the rotation angle of the mobile terminal is larger than a preset angle, generating a trigger instruction corresponding to the rotation angle; and acquiring target image information according to the trigger instruction and the current image information, and playing the target image information.
Optionally, when generating the trigger instruction corresponding to the rotation angle when the rotation angle of the mobile terminal is greater than the preset angle, the processor 610 is further configured to: acquire the rotation direction of the rotation angle when the rotation angle of the mobile terminal is greater than the preset angle; and generate a corresponding trigger instruction according to the rotation direction; wherein the direction of rotation includes: a direction of counterclockwise rotation along a Y-axis of the mobile terminal, a direction of clockwise rotation along a Y-axis of the mobile terminal, a direction of counterclockwise rotation along an X-axis of the mobile terminal, and a direction of clockwise rotation along an X-axis of the mobile terminal; the mobile terminal corresponds to a three-dimensional rectangular coordinate system comprising an X axis, a Y axis and a Z axis that are perpendicular to one another, the X axis is parallel to a first direction of the mobile terminal, the Y axis is perpendicular to the first direction of the mobile terminal, and the first direction is the width or length direction of the mobile terminal.
Optionally, when obtaining the target image information according to the trigger instruction and the current image information and playing the target image information, the processor 610 is further configured to: acquiring the current posture of the mobile terminal corresponding to the current image information; determining a target posture of the mobile terminal according to the current posture and the rotation direction; and acquiring target image information corresponding to the target posture, and displaying the target image information through the display unit 606.
Optionally, when determining the target posture of the mobile terminal according to the current posture and the rotation direction, the processor 610 is further configured to: and on the basis of the current posture, rotating the target angle according to the rotating direction to determine the target posture.
According to the technical scheme, a control instruction is generated when the initial posture of the mobile terminal meets the preset posture condition, shooting is carried out according to the control instruction, the posture information and the shot image information during shooting are acquired, and a video file displaying a complete three-dimensional picture of the target object is generated from the image information, so that the target object can be displayed flexibly and conveniently and a realistic sensory effect can be simulated.
Furthermore, while watching the video, the user does not need to operate a progress bar manually to control the playing progress; the mobile terminal only needs to be rotated by a certain angle to generate a trigger instruction, and the mobile terminal jumps to the corresponding picture for playback according to the trigger instruction. When the user rotates the mobile terminal in different directions, playback jumps to different pictures, which improves convenience for the user. Meanwhile, when the video is played, target objects at different angles are displayed by combining the posture change of the mobile terminal with the posture information in the video file, so that a realistic picture experience can be simulated and the visual experience of the user is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 601 may be used for receiving and sending signals during a message sending and receiving process or a call process, and specifically, receives downlink data from a base station and then processes the received downlink data to the processor 610; in addition, the uplink data is transmitted to the base station. In general, radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 601 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 602, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into an audio signal and output as sound. Also, the audio output unit 603 may also provide audio output related to a specific function performed by the mobile terminal 600 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
The input unit 604 is used to receive audio or video signals. The input Unit 604 may include a Graphics Processing Unit (GPU) 6041 and a microphone 6042, and the Graphics processor 6041 processes image data of a still picture or video obtained by an image capturing apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 606. The image frames processed by the graphic processor 6041 may be stored in the memory 609 (or other storage medium) or transmitted via the radio frequency unit 601 or the network module 602. The microphone 6042 can receive sound, and can process such sound into audio data. The processed audio data may be converted into a format output transmittable to a mobile communication base station via the radio frequency unit 601 in case of the phone call mode.
the mobile terminal 600 also includes at least one sensor 605, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 6061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 6061 and/or the backlight when the mobile terminal 600 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 605 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 606 is used to display information input by the user or information provided to the user. The Display unit 606 may include a Display panel 6061, and the Display panel 6061 may be configured by a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 607 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. Touch panel 6071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 6071 using a finger, stylus, or any suitable object or accessory). The touch panel 6071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 610, receives a command from the processor 610, and executes the command. In addition, the touch panel 6071 can be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The user input unit 607 may include other input devices 6072 in addition to the touch panel 6071. Specifically, the other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein again.
further, the touch panel 6071 can be overlaid on the display panel 6061, and when the touch panel 6071 detects a touch operation on or near the touch panel 6071, the touch operation is transmitted to the processor 610 to determine the type of the touch event, and then the processor 610 provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although the touch panel 6071 and the display panel 6061 are shown in fig. 6 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 6071 and the display panel 6061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
the interface unit 608 is an interface through which an external device is connected to the mobile terminal 600. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 600 or may be used to transmit data between the mobile terminal 600 and external devices.
the memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 609 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
the processor 610 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 609 and calling data stored in the memory 609, thereby integrally monitoring the mobile terminal. Processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The mobile terminal 600 may further include a power supply 611 (e.g., a battery) for supplying power to the various components. Preferably, the power supply 611 is logically connected to the processor 610 via a power management system, so that functions such as managing charging, discharging and power consumption are performed via the power management system.
In addition, the mobile terminal 600 includes some functional modules that are not shown, which are not described in detail here.
Preferably, an embodiment of the present invention further provides a mobile terminal, which includes a processor 610, a memory 609, and a computer program stored in the memory 609 and capable of running on the processor 610, where the computer program is executed by the processor 610 to implement each process of the above video shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the video shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (16)
1. A video shooting method, comprising:
generating a control instruction when the initial posture of the mobile terminal meets a preset posture condition;
Shooting according to the control instruction, and acquiring each posture of the mobile terminal in the shooting process and image information shot under each posture;
Generating a video file according to the image information;
When the initial posture of the mobile terminal meets the preset posture condition, the step of generating the control instruction comprises the following steps:
Acquiring a gravity acceleration component corresponding to each coordinate axis of the mobile terminal through a gravity sensor arranged in the mobile terminal;
Detecting whether the gravity acceleration component on each coordinate axis of the mobile terminal is in a corresponding preset range, if so, generating the control instruction by the mobile terminal;
The mobile terminal corresponds to a three-dimensional rectangular coordinate system, and the three-dimensional rectangular coordinate system comprises an X axis, a Y axis and a Z axis which are perpendicular to each other;
a gravity acceleration component Xg on the X axis has a corresponding preset range of: 3.0 > Xg > -3.0; a gravity acceleration component Yg on the Y axis has a corresponding preset range of: 10.0 > Yg > 8.0; and a gravity acceleration component Zg on the Z axis has a corresponding preset range of: 3.0 > Zg > -3.0;
The shooting according to the control instruction to obtain each posture of the mobile terminal in the shooting process and the image information shot in each posture comprises:
when the rotation angle theta of the current posture relative to the initial posture is larger than or equal to 360 degrees, determining that horizontal rotation shooting is finished;
when Zg is larger than 8.0 and smaller than 10.0 and the upward rotation angle alpha of the current posture relative to the initial posture is larger than 80 degrees, determining that the shooting above the target object is finished;
when Zg is less than -7.0 and the current posture is rotated downwards by an angle beta larger than 80 degrees relative to the initial posture, determining that the shooting below the target object is finished.
2. The video shooting method according to claim 1, wherein the step of shooting according to the control instruction, and acquiring the respective postures of the mobile terminal during shooting and the image information shot in the respective postures comprises:
according to the control instruction, performing stereo shooting on a target object through a camera of the mobile terminal according to a preset sequence;
acquiring each posture of the mobile terminal in the shooting process and image information shot under each posture;
wherein the posture of the mobile terminal includes: the gravity acceleration component information on each coordinate axis of the mobile terminal and the rotation angle information of the mobile terminal relative to the initial posture.
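For illustration, the per-posture record described in claim 2 could be modelled as below; the field and type names are assumptions, and only the listed contents (gravity acceleration components per axis, rotation angle relative to the initial posture, and the captured image information) come from the claim.

```kotlin
// Illustrative record pairing each captured image with the posture it was shot in.
data class PostureSample(
    val xg: Double,                     // gravity acceleration component on the X axis
    val yg: Double,                     // gravity acceleration component on the Y axis
    val zg: Double,                     // gravity acceleration component on the Z axis
    val rotationFromInitialDeg: Double, // rotation angle relative to the initial posture
    val imagePath: String               // image information captured in this posture
)

// Samples accumulated in shooting order.
val samples = mutableListOf<PostureSample>()

fun onFrameCaptured(xg: Double, yg: Double, zg: Double, angleDeg: Double, path: String) {
    samples += PostureSample(xg, yg, zg, angleDeg, path)
}
```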
3. The video shooting method according to claim 1, wherein the step of generating a video file based on the image information comprises:
And when shooting is finished, synthesizing the acquired image information according to the shooting time sequence to generate the video file.
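A minimal sketch of the synthesis step in claim 3, assuming each captured image carries a timestamp; the Frame type and the encoder hand-off are assumptions, and no real codec API is shown.

```kotlin
// Frames are put into shooting-time order before being handed to an encoder.
data class Frame(val capturedAtMs: Long, val imagePath: String)

fun orderForSynthesis(frames: List<Frame>): List<Frame> =
    frames.sortedBy { it.capturedAtMs }   // shooting time sequence, as the claim requires

// A real implementation would then feed the ordered frames to a video encoder.
```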
4. The video shooting method according to claim 1, further comprising, after generating a video file based on the image information:
when the video file is being played and rotation of the mobile terminal is detected, pausing playing of the video file and acquiring current image information;
when the rotation angle of the mobile terminal is larger than a preset angle, generating a trigger instruction corresponding to the rotation angle;
And acquiring target image information according to the trigger instruction and the current image information, and playing the target image information.
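The playback behaviour of claim 4 might look roughly like the sketch below; the player class, the 15-degree threshold, and the index into the frame list are assumptions, since the claim only speaks of "a preset angle" and "current image information".

```kotlin
// Pauses playback when rotation is detected and emits a trigger once the rotation
// exceeds a preset angle; all names and the threshold value are illustrative.
class RotationAwarePlayer(private val imagePaths: List<String>) {
    private val presetAngleDeg = 15.0   // assumed value; the claim leaves it unspecified
    var paused = false
    var currentIndex = 0

    fun onRotationDetected(rotationDeg: Double) {
        paused = true                                // pause playing the video file
        val currentImage = imagePaths[currentIndex]  // current image information
        if (rotationDeg > presetAngleDeg) {
            println("trigger instruction: rotation=$rotationDeg deg, current=$currentImage")
        }
    }
}
```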
5. The video shooting method according to claim 4, wherein the step of generating the trigger instruction corresponding to the rotation angle when the rotation angle of the mobile terminal is greater than a preset angle comprises:
when the rotation angle of the mobile terminal is larger than a preset angle, acquiring the rotation direction of the rotation angle;
Generating a corresponding trigger instruction according to the rotation direction;
wherein the direction of rotation comprises: a direction of counterclockwise rotation about the Y axis of the mobile terminal, a direction of clockwise rotation about the Y axis of the mobile terminal, a direction of counterclockwise rotation about the X axis of the mobile terminal, and a direction of clockwise rotation about the X axis of the mobile terminal;
the mobile terminal corresponds to a three-dimensional rectangular coordinate system, the three-dimensional rectangular coordinate system comprises an X axis, a Y axis and a Z axis which are perpendicular to each other, the X axis is parallel to the first direction of the mobile terminal, the Y axis is perpendicular to the first direction of the mobile terminal, and the first direction is the width or length direction of the mobile terminal.
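Claim 5 maps the detected rotation direction to a trigger instruction; the enum values and the particular direction-to-instruction pairing below are illustrative assumptions rather than anything stated in the claims.

```kotlin
// The four rotation directions about the terminal's X and Y axes listed in claim 5.
enum class RotationDirection { Y_COUNTERCLOCKWISE, Y_CLOCKWISE, X_COUNTERCLOCKWISE, X_CLOCKWISE }

// Hypothetical trigger instructions; the claims do not name them.
enum class TriggerInstruction { SHOW_LEFT_SIDE, SHOW_RIGHT_SIDE, SHOW_TOP, SHOW_BOTTOM }

fun triggerFor(direction: RotationDirection): TriggerInstruction = when (direction) {
    RotationDirection.Y_COUNTERCLOCKWISE -> TriggerInstruction.SHOW_LEFT_SIDE
    RotationDirection.Y_CLOCKWISE        -> TriggerInstruction.SHOW_RIGHT_SIDE
    RotationDirection.X_COUNTERCLOCKWISE -> TriggerInstruction.SHOW_TOP
    RotationDirection.X_CLOCKWISE        -> TriggerInstruction.SHOW_BOTTOM
}
```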
6. The video shooting method of claim 5, wherein the step of obtaining target image information according to the trigger instruction and the current image information and playing the target image information comprises:
Acquiring the current posture of the mobile terminal corresponding to the current image information;
Determining a target posture of the mobile terminal according to the current posture and the rotating direction;
And acquiring the target image information corresponding to the target posture, and playing and displaying the target image information.
7. The video shooting method according to claim 6, wherein the step of determining the target posture of the mobile terminal based on the current posture and the rotation direction comprises:
on the basis of the current posture, rotating by a target angle in the rotating direction to determine the target posture.
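Claims 6 and 7 derive a target posture by rotating the current posture by a target angle in the detected direction and then play the image recorded for that posture. The sketch below assumes a 30-degree step, a wrap-around to [0, 360), and a nearest-match lookup over the recorded frames; none of these details is fixed by the claims.

```kotlin
import kotlin.math.abs

// A recorded frame indexed by the horizontal rotation angle at which it was shot.
data class RecordedFrame(val horizontalAngleDeg: Double, val imagePath: String)

// Target posture = current posture rotated by a target angle in the rotation direction.
fun targetAngle(currentDeg: Double, clockwise: Boolean, stepDeg: Double = 30.0): Double {
    val raw = if (clockwise) currentDeg + stepDeg else currentDeg - stepDeg
    return ((raw % 360.0) + 360.0) % 360.0   // wrap into [0, 360)
}

// Pick the recorded frame whose shooting angle is closest to the target posture.
fun frameForTarget(frames: List<RecordedFrame>, targetDeg: Double): RecordedFrame? =
    frames.minByOrNull { abs(it.horizontalAngleDeg - targetDeg) }
```

For example, targetAngle(90.0, clockwise = true) yields 120.0, and frameForTarget then selects whichever recorded frame was shot closest to 120 degrees.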
8. A mobile terminal, comprising:
the first generating module is used for generating a control instruction when the initial posture of the mobile terminal meets a preset posture condition;
The acquisition module is used for shooting according to the control instruction and acquiring each posture of the mobile terminal in the shooting process and image information shot under each posture;
The second generation module is used for generating a video file according to the image information;
Wherein the first generating module comprises:
The first obtaining submodule is used for obtaining a gravity acceleration component corresponding to each coordinate axis of the mobile terminal through a gravity sensor arranged in the mobile terminal;
the processing submodule is used for detecting whether the gravity acceleration component on each coordinate axis of the mobile terminal is in a corresponding preset range, and if so, the mobile terminal generates the control instruction;
The mobile terminal corresponds to a three-dimensional rectangular coordinate system, and the three-dimensional rectangular coordinate system comprises an X axis, a Y axis and a Z axis which are perpendicular to each other;
wherein the gravity acceleration component Xg on the X axis has a corresponding preset range of 3.0 > Xg > -3.0, the gravity acceleration component Yg on the Y axis has a corresponding preset range of 10.0 > Yg > 8.0, and the gravity acceleration component Zg on the Z axis has a corresponding preset range of 3.0 > Zg > -3.0;
wherein the shooting according to the control instruction to obtain each posture of the mobile terminal in the shooting process and the image information shot in each posture comprises:
when the rotation angle theta of the current posture relative to the initial posture is larger than or equal to 360 degrees, determining that horizontal rotation shooting is finished;
when Zg is greater than 8.0 and less than 10.0 and the upward rotation angle alpha of the current posture relative to the initial posture is greater than 80 degrees, determining that shooting above the target object is finished;
and when Zg meets the corresponding condition and the downward rotation angle beta of the current posture relative to the initial posture is greater than 80 degrees, determining that shooting below the target object is finished.
9. The mobile terminal of claim 8, wherein the acquisition module comprises:
the shooting submodule is used for carrying out three-dimensional shooting on the target object according to the control instruction and a preset sequence through a camera of the mobile terminal;
The second acquisition submodule is used for acquiring each posture of the mobile terminal in the shooting process and image information shot under each posture;
wherein the posture of the mobile terminal includes: the gravity acceleration component information on each coordinate axis of the mobile terminal and the rotation angle information of the mobile terminal relative to the initial posture.
10. The mobile terminal of claim 8, wherein the second generating module is further configured to:
and when shooting is finished, synthesizing the acquired image information according to the shooting time sequence to generate the video file.
11. The mobile terminal of claim 8, wherein the mobile terminal further comprises:
the processing module is used for, after the second generating module generates the video file according to the image information, pausing playing of the video file and acquiring current image information when rotation of the mobile terminal is detected while the video file is being played;
the third generation module is used for generating a trigger instruction corresponding to the rotation angle when the rotation angle of the mobile terminal is larger than a preset angle;
And the playing module is used for acquiring target image information according to the triggering instruction and the current image information and playing the target image information.
12. The mobile terminal of claim 11, wherein the third generating module comprises:
the third obtaining submodule is used for obtaining the rotating direction of the rotating angle when the rotating angle of the mobile terminal is larger than the preset angle;
The generating submodule is used for generating a corresponding trigger instruction according to the rotating direction;
wherein the direction of rotation comprises: a direction of counterclockwise rotation about the Y axis of the mobile terminal, a direction of clockwise rotation about the Y axis of the mobile terminal, a direction of counterclockwise rotation about the X axis of the mobile terminal, and a direction of clockwise rotation about the X axis of the mobile terminal;
The mobile terminal corresponds to a three-dimensional rectangular coordinate system, the three-dimensional rectangular coordinate system comprises an X axis, a Y axis and a Z axis which are perpendicular to each other, the X axis is parallel to the first direction of the mobile terminal, the Y axis is perpendicular to the first direction of the mobile terminal, and the first direction is the width or length direction of the mobile terminal.
13. The mobile terminal of claim 12, wherein the play module comprises:
the fourth obtaining submodule is used for obtaining the current posture of the mobile terminal corresponding to the current image information;
The determining submodule is used for determining the target posture of the mobile terminal according to the current posture and the rotating direction;
And the playing sub-module is used for acquiring the target image information corresponding to the target posture and playing and displaying the target image information.
14. The mobile terminal of claim 13, wherein the determination sub-module is further configured to:
and on the basis of the current posture, rotating a target angle according to the rotating direction to determine the target posture.
15. A mobile terminal, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the video shooting method according to any one of claims 1 to 7.
16. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the video shooting method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710889283.3A CN107707817B (en) | 2017-09-27 | 2017-09-27 | video shooting method and mobile terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107707817A CN107707817A (en) | 2018-02-16 |
CN107707817B true CN107707817B (en) | 2019-12-17 |
Family
ID=61176185
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710889283.3A Active CN107707817B (en) | 2017-09-27 | 2017-09-27 | video shooting method and mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107707817B (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108462848B (en) * | 2018-03-23 | 2021-08-24 | 成都合能创越软件有限公司 | Low-power law enforcement recorder |
CN110830704B (en) * | 2018-08-07 | 2021-10-22 | 纳宝株式会社 | Method and device for generating rotating image |
CN110177228B (en) * | 2018-11-06 | 2021-11-26 | 广东小天才科技有限公司 | Video recording method based on wearable device and wearable device |
CN109587451A (en) * | 2018-12-25 | 2019-04-05 | 青岛小鸟看看科技有限公司 | A kind of video capture device and its control method showing equipment for virtual reality |
CN111949114A (en) * | 2019-05-15 | 2020-11-17 | 中国移动通信有限公司研究院 | Image processing method and device and terminal |
CN112887777B (en) * | 2019-11-29 | 2022-12-23 | 阿里巴巴集团控股有限公司 | Interactive prompting method and device for interactive video, electronic equipment and storage medium |
CN111638795B (en) * | 2020-06-05 | 2024-06-11 | 上海商汤智能科技有限公司 | Method and device for controlling virtual object display state |
CN112839170B (en) * | 2020-12-31 | 2022-07-05 | 上海米哈游天命科技有限公司 | Shooting method, shooting device, electronic equipment and storage medium |
CN113791640A (en) * | 2021-09-10 | 2021-12-14 | 深圳市道通智能航空技术股份有限公司 | Image acquisition method and device, aircraft and storage medium |
CN113891000B (en) * | 2021-11-02 | 2024-06-14 | 维沃移动通信有限公司 | Shooting method and device and electronic equipment |
CN114040183B (en) * | 2021-11-08 | 2024-04-30 | 深圳传音控股股份有限公司 | Image processing method, mobile terminal and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103530799A (en) * | 2013-10-22 | 2014-01-22 | 惠州Tcl移动通信有限公司 | Browsing method for realizing palm apartment viewing according to 3D (three-dimensional) galleries and 3D apartment viewing system |
CN104038694A (en) * | 2014-05-27 | 2014-09-10 | 小米科技有限责任公司 | Method and device for displaying panoramic image |
CN104394451A (en) * | 2014-12-05 | 2015-03-04 | 宁波菊风系统软件有限公司 | Video presenting method for intelligent mobile terminal |
CN106817625A (en) * | 2016-12-19 | 2017-06-09 | 广州视源电子科技股份有限公司 | Method, device and system for adjusting viewing angle of panoramic video and control terminal |
CN107026973A (en) * | 2016-02-02 | 2017-08-08 | 株式会社摩如富 | Image processing apparatus, image processing method and photographic auxiliary equipment |
Similar Documents
Publication | Title
---|---
CN107707817B (en) | video shooting method and mobile terminal
CN108989672B (en) | Shooting method and mobile terminal
CN109525874B (en) | Screen capturing method and terminal equipment
CN108153422B (en) | Display object control method and mobile terminal
CN108459815B (en) | Display control method and mobile terminal
CN110174993B (en) | Display control method, terminal equipment and computer readable storage medium
CN108628515B (en) | Multimedia content operation method and mobile terminal
CN108174109B (en) | Photographing method and mobile terminal
CN109710349B (en) | Screen capturing method and mobile terminal
CN108900695B (en) | Display processing method, terminal equipment and computer readable storage medium
CN110465080A (en) | Control method, apparatus, mobile terminal and the computer readable storage medium of vibration
CN108958593B (en) | Method for determining communication object and mobile terminal
CN108132749B (en) | Image editing method and mobile terminal
CN109005355B (en) | Shooting method and mobile terminal
CN109002245B (en) | Application interface operation method and mobile terminal
CN108881721B (en) | Display method and terminal
CN108924422B (en) | Panoramic photographing method and mobile terminal
CN108833791B (en) | Shooting method and device
CN108174110B (en) | Photographing method and flexible screen terminal
CN110457885B (en) | Operation method and electronic equipment
CN111031246A (en) | Shooting method and electronic equipment
CN109445589B (en) | Multimedia file playing control method and terminal equipment
CN109618055B (en) | Position sharing method and mobile terminal
CN109358913B (en) | Application program starting method and terminal equipment
CN108536513B (en) | Picture display direction adjusting method and mobile terminal
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant