CN115278049A - Shooting method and device thereof - Google Patents

Shooting method and device thereof

Info

Publication number
CN115278049A
Authority
CN
China
Prior art keywords
target
image
moment
plane
pose
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210684181.9A
Other languages
Chinese (zh)
Inventor
罗子扬
杨建平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202210684181.9A priority Critical patent/CN115278049A/en
Publication of CN115278049A publication Critical patent/CN115278049A/en
Priority to PCT/CN2023/099597 priority patent/WO2023241495A1/en
Pending legal-status Critical Current


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application provides a shooting method and a shooting device, belonging to the technical field of image processing. The method includes the following steps: receiving a first input, the first input being used to determine a target plane; and in response to the first input, controlling a camera of an electronic device to shoot with the target plane as a reference plane to obtain a target video, wherein in the target video a target object is displayed at a first-direction viewing angle relative to the target plane, and the target object is an image of a shot object in the target plane.

Description

Shooting method and device thereof
Technical Field
The embodiment of the application relates to the technical field of image processing, in particular to a shooting method and a shooting device.
Background
In order to improve the shooting effect, the conventional electronic equipment is generally provided with a shooting anti-shake function, which can solve the problem of shaking of a shot picture caused by slight shaking of the electronic equipment in the shooting process.
However, the existing shooting anti-shake function cannot solve the problem that the shot content undergoes perspective distortion when the shooting position and shooting angle of the shooting device change, so the shooting effect is poor.
Disclosure of Invention
The embodiment of the application provides a shooting method and a shooting device, which can solve the problem that the existing shooting anti-shake function cannot prevent perspective distortion of the shot content when the shooting position and shooting angle of the shooting device change, which results in a poor shooting effect.
In a first aspect, an embodiment of the present application provides a shooting method applied to an electronic device, including:
receiving a first input, the first input for determining a target plane;
and in response to the first input, controlling a camera of the electronic equipment to shoot by taking the target plane as a reference plane to obtain a target video, wherein in the target video, a target object is displayed at a first direction view angle relative to the target plane, and the target object is an image of a shot object in the target plane.
In a second aspect, an embodiment of the present application further provides a shooting device applied to an electronic device, including:
a receiving module for receiving a first input, the first input being used to determine a target plane;
and the control module is used for responding to the first input and controlling a camera of the electronic equipment to shoot by taking the target plane as a reference plane to obtain a target video, wherein in the target video, a target object is displayed at a first direction view angle relative to the target plane, and the target object is an image of a shot object in the target plane.
In a third aspect, an embodiment of the present application further provides an electronic device, which includes a processor, a memory, and a program or an instruction stored in the memory and executable on the processor, and when the program or the instruction is executed by the processor, the shooting method according to the first aspect is implemented.
In a fourth aspect, the present application further provides a readable storage medium, on which a program or instructions are stored, and when the program or instructions are executed by a processor, the shooting method according to the first aspect is implemented.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, which is stored in a storage medium and executed by at least one processor to implement the method according to the first aspect.
In the embodiment of the application, after receiving a first input for determining a target plane, an electronic device may control a camera of the electronic device to shoot by using the target plane as a reference plane in response to the first input, so as to obtain a target video; wherein, in the target video, a target object is displayed at a first direction visual angle relative to the target plane, and the target object is an image of a shooting object in the target plane. Therefore, no matter how the shooting position and the shooting angle of the electronic equipment change in the shooting process, the imaging of the shooting object in the target plane can be displayed in the video at the first direction visual angle relative to the target plane, and perspective distortion will not occur due to the change of the shooting position and the shooting angle of the electronic equipment, so that the shooting effect can be improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the description of the embodiments of the present application will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1a is a schematic view of a shooting angle of view provided by an embodiment of the present application;
FIG. 1b is a schematic diagram of a result of a shooting provided by an embodiment of the present application;
FIG. 2 is a flowchart of a shooting method provided in an embodiment of the present application;
FIG. 3a is a first schematic diagram of selecting a target plane provided by an embodiment of the present application;
FIG. 3b is a second schematic diagram of selecting a target plane provided by an embodiment of the present application;
FIG. 4 is a schematic view of a rotating shaft provided in an embodiment of the present application;
fig. 5 is a schematic diagram of an included angle between the shooting device and a target plane according to an embodiment of the present application;
FIG. 6 is a schematic diagram of image rectification provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of an image rectification result provided by an embodiment of the present application;
fig. 8 is a structural diagram of a shooting device provided in an embodiment of the present application;
fig. 9 is one of the structural diagrams of an electronic device provided in the embodiment of the present application;
fig. 10 is a second structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements, and are not intended to portray a particular order or sequence. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that embodiments of the application may be practiced in sequences other than those illustrated or described herein, and that the terms "first," "second," and the like are generally used herein in a generic sense and do not limit the number of terms, e.g., the first term can be one or more than one. In addition, "and/or" in the specification and claims means at least one of connected objects, a character "/" generally means that a preceding and succeeding related objects are in an "or" relationship.
For convenience of understanding, some contents related to the embodiments of the present application are described below:
the anti-shake types may include optical anti-shake (OIS), electronic anti-shake (EIS), and horizon anti-shake. Wherein the OIS is based on a hardware implementation. The EIS is realized based on picture cropping and a gyroscope, specifically, the EIS crops a picture, judges the motion state of a current machine based on gyroscope data, and stabilizes the picture by reverse compensation in a cropping range by combining the analysis result of the gyroscope data, so that a relatively stable shooting result is brought to a user. The horizon anti-shake is that on the basis of the electronic anti-shake technology, the gravity sensor data is added to keep the picture on the positive direction of the horizon all the time.
Mature anti-shake schemes exist for both the software scheme (EIS) and the hardware scheme, optical anti-shake (OIS), as well as a mature combined software-and-hardware "O + E" scheme: on the basis of OIS lens anti-shake, the OIS transmits the compensated information and the lens position change to the EIS in real time, and the EIS then calculates the gyroscope and OIS data and performs software anti-shake processing, so that the shot video can be relatively stable and smooth.
However, the current anti-shake schemes cannot solve the problem that, when a user shoots, the content of the shooting plane undergoes perspective distortion due to changes in the shooting position and angle. The following is an example: when a user shoots a plane, such as a screen, a two-dimensional code, an infinite view, etc., the shooting angle and position of the shooting device may change, either because the user is constrained by the actual site or because of factors such as shake, as shown in fig. 1a. The change in the shooting position and angle of the shooting device then causes perspective distortion of the shooting plane, as shown in fig. 1b, which corresponds to the images of the shooting plane shot from the left, the right and the front in the upper figure.
Based on this, the embodiment of the application provides an improvement scheme for the problem that the perspective and distortion of the shooting plane occur due to the change of the angle and the position of the shooting equipment during the video recording. The embodiment of the application can control the shooting device to shoot by taking the shooting plane as a reference plane by determining the shooting plane (hereinafter referred to as a target plane), so that the shooting plane is displayed in a first-direction angle of view in a video, namely, the shooting plane is always kept in a shooting state in the first-direction angle of view. Therefore, the user does not need to always keep the shooting equipment absolutely stable and the shooting position consistent in the shooting process, and a stable shooting result without perspective distortion can be obtained, so that the shooting effect can be improved. The embodiment of the application can enable a user to obtain the constant optimal view of the first direction visual angle when shooting the shooting plane at different angles and positions.
In the embodiment of the present application, the first-direction viewing angle may be: the front viewing angle, the rear viewing angle, the left viewing angle, the right viewing angle, the upper viewing angle or the lower viewing angle, which can be determined according to actual requirements and is not limited in the embodiment of the present application. The front and rear viewing angles may be collectively referred to as positive-direction viewing angles, and the other viewing angles may be collectively referred to as side-direction viewing angles. The positive direction of the target plane can be understood as the direction perpendicular to the target plane.
The shooting method of the embodiment of the application can be applied to or executed by an electronic device, and the electronic device is the shooting device. In practical applications, the electronic device may be a terminal-side device such as a mobile phone, a tablet personal computer (Tablet Personal Computer), a laptop computer (Laptop Personal Computer, also called a notebook computer), a personal digital assistant (PDA), a palmtop computer, a netbook, an ultra-mobile personal computer (UMPC), a mobile internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device (Wearable Device), a vehicle-mounted device, a pedestrian terminal (PUE), or a smart home device (a home device with a wireless communication function, such as a refrigerator, a television, a washing machine, or furniture).
The shooting method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings by some embodiments and application scenarios thereof.
Referring to fig. 2, fig. 2 is a flowchart of a shooting method provided in an embodiment of the present application. As shown in fig. 2, the photographing method may include the steps of:
step 201, receiving a first input, where the first input is used to determine a target plane.
The target plane is a shooting reference plane, namely in the shooting process, the shooting picture is shot by taking the target plane as the reference plane, so that the imaging of the shot object in the target plane is always presented in the shot video at a first direction visual angle relative to the target plane, the imaging of the shot object in the target plane cannot have the perspective problem, and the imaging effect of the shot object in the target plane can be improved.
The first input may be any input that can determine a target plane. For example, in a first implementation manner, the first input may be: an input of selecting the target plane from a plurality of planes provided by the electronic device. In a second implementation manner, the first input may be: an input for the target object in a shooting preview interface; in this implementation, the electronic device may determine the plane in which the shooting object corresponding to the target object is located as the target plane. Compared with the first implementation, in which the target plane can only be selected from preset planes, the second implementation allows the user to calibrate the target plane by himself, which can improve the flexibility of determining the target plane.
For the second implementation manner, the first input may be specifically expressed as a long press input or a double-click input for the target object, or the like; alternatively, the first input may be embodied as: selecting an input of N points of the target object, N may be an integer greater than or equal to 4.
It should be noted that, at the input time of the first input, in one implementation manner, the electronic device may be in a first-direction angle-of-view shooting state of the target plane, that is, the electronic device shoots at the first direction angle of view relative to the target plane. In another implementation manner, the electronic device may also be in a shooting state at a direction angle of view (hereinafter referred to as a second direction angle of view) other than the first direction angle of view of the target plane, that is, the electronic device shoots at the second direction angle of view relative to the target plane. Taking the first direction viewing angle as the positive direction viewing angle as an example, at the input time of the first input, as shown in fig. 3a, the electronic device may be in a shooting state at the positive direction viewing angle of the target plane; as shown in fig. 3b, the electronic device may also be in a side-view shooting state of the target plane. That is, regardless of whether the electronic device is in the first-direction angle-of-view shooting state of the target plane, the electronic device may accurately determine the target plane in response to the first input.
Step 202, in response to the first input, controlling a camera of the electronic device to shoot by using the target plane as a reference plane, so as to obtain a target video, wherein in the target video, a target object is displayed at a first directional angle of view relative to the target plane, and the target object is an image of a shot object in the target plane.
The target plane is taken as a reference plane for shooting, and the following conditions can be met: the imaging of the photographic subject in the target plane is always presented in the captured video at a first directional perspective relative to the target plane. Therefore, no matter how the shooting angle or the shooting position changes, the imaging of the shot object in the target plane can be ensured not to have the perspective problem, and the imaging effect of the shot object in the target plane can be improved.
In the shooting method of this embodiment, after receiving a first input for determining a target plane, an electronic device may control a camera of the electronic device to shoot with the target plane as a reference plane in response to the first input, so as to obtain a target video; wherein, in the target video, a target object is displayed at a first direction visual angle relative to the target plane, and the target object is an image of a shooting object in the target plane. Therefore, no matter how the shooting position and the shooting angle of the electronic equipment change in the shooting process, the imaging of the shooting object in the target plane can be displayed in the video at the first direction visual angle relative to the target plane, and perspective distortion will not occur due to the change of the shooting position and the shooting angle of the electronic equipment, so that the shooting effect can be improved.
For the second implementation manner, in some embodiments, the controlling, in response to the first input, a camera of the electronic device to shoot with the target plane as a reference plane to obtain a target video may include:
determining first information in response to the first input, wherein the first information is a first pose or a first included angle, and the first pose is a pose when the electronic equipment shoots at a first direction visual angle relative to the target plane; a first included angle is an included angle between the electronic equipment and the target plane at a first moment, and the first moment is the input moment of the first input;
and transforming the image collected by the camera according to the first information to obtain a target video.
It can be seen from the above that, when the user calibrates the target plane in the shooting preview interface, the electronic device may be in the first-direction viewing-angle shooting state of the target plane; or the electronic device may be in a second-direction viewing-angle shooting state of the target plane.
The electronic device can determine its shooting state at the first moment by detecting whether the target object in the shooting preview interface is perspective-distorted compared with the target object shot when the electronic device is in the first-direction viewing-angle shooting state.
In the case that the target object is not subjected to perspective distortion in the shooting preview interface compared with the target object shot by the electronic device in the first-direction angle-of-view shooting state, it can be determined that the electronic device is in the first-direction shooting state of the target plane. In this case, the current pose of the electronic device may be directly determined as the first pose, and the current included angle between the electronic device and the target plane, that is, the first included angle is 0.
In the case that the target object in the shooting preview interface is perspective distorted compared with the target object shot by the electronic device in the first direction view angle shooting state, it can be determined that the electronic device is in the second direction shooting state of the target plane. In this case, the first pose and the first included angle may be derived based on the perspective distortion degree of the target object in the shooting preview interface and the current pose of the electronic device.
Then, the image acquired by the camera can be transformed using the obtained first pose or first included angle to obtain the target video. The image transformation can be understood as a perspective transformation of the target object, which includes: generating a corresponding perspective distortion grid and moving the image pixels according to the grid to perform perspective correction on the image, so that the shooting object in the target plane is always displayed in the image at the positive-direction shooting viewing angle of the target plane, and the imaging effect of the target plane can be improved.
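As a concrete illustration of the grid-based perspective correction described above, the following sketch (assuming Python with NumPy and OpenCV; the function name and the availability of an inverse perspective matrix H_inv are assumptions of this write-up) builds a dense coordinate grid, back-projects each output pixel through H_inv, and moves the pixels with a remap.

```python
import cv2
import numpy as np

def warp_by_grid(image, H_inv):
    """Illustrative 'perspective distortion grid' step: build a grid of
    destination pixel coordinates, map each grid point back into the source
    image through the inverse perspective matrix H_inv, and move the pixels
    accordingly with a remap.
    """
    h, w = image.shape[:2]
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    ones = np.ones_like(xs)
    grid = np.stack([xs, ys, ones], axis=-1).reshape(-1, 3).T   # 3 x (h*w)
    src = H_inv @ grid                                          # back-project
    src /= src[2]                                               # homogeneous divide
    map_x = src[0].reshape(h, w).astype(np.float32)
    map_y = src[1].reshape(h, w).astype(np.float32)
    return cv2.remap(image, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```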
According to the embodiment, the electronic device can correct the collected image including the shooting object in the target plane according to the pose of the electronic device when shooting is carried out at the first direction visual angle relative to the target plane, or the included angle between the electronic device and the target plane at the input moment of the first input, so that the image is displayed at the first direction visual angle relative to the target plane, and the imaging effect of the target plane can be improved.
The specific implementation of the image transformation may be different for different representations of the first information, as described in detail below.
And in case one, the first information is the first pose.
In this case, in some embodiments, the transforming, according to the first information, the image acquired by the camera to obtain the target video may include:
acquiring a second image and a second pose, wherein the second image is an image which is acquired by a camera of the electronic equipment at a third moment and comprises the target object, the second pose is a pose of the electronic equipment at the third moment, and the third moment is behind the first moment;
determining a first rotation matrix according to the first pose and the second pose;
adjusting the target object in the second image according to the first rotation matrix, wherein the adjusted target object in the second image is displayed at a first directional viewing angle relative to the target plane;
wherein the target video comprises the adjusted second image.
The second image may be any image including the target object captured by a camera of the electronic device after the target plane is selected. The second pose is a pose when the electronic device acquires the second image, and can be obtained by a correlation technique, which is not described here.
In this embodiment, the electronic device may calculate a first rotation matrix based on the first pose and the second pose, where the first rotation matrix is a three-dimensional matrix. The first pose, the second pose, and the first rotation matrix satisfy: the first pose can be obtained by rotating the second pose through the first rotation matrix.
After the first rotation matrix is obtained, the target object in the second image may be transformed by using the first rotation matrix based on a conversion principle between a three-dimensional world coordinate and a two-dimensional pixel coordinate, so as to obtain the target image of the first-direction shooting view angle of the target plane.
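The following is a minimal sketch of this case-one transformation, assuming Python with NumPy and OpenCV, that the two poses are available as 3x3 rotation matrices, and that the camera intrinsic matrix K is known; for an (approximately) pure-rotation change of pose the induced image warp is the homography K × R × K^(-1). Names are illustrative, not the patent's implementation.

```python
import numpy as np
import cv2

def rectify_to_first_pose(image, K, R_first, R_second):
    """Case-one sketch: R_first is the pose when shooting at the
    first-direction view, R_second the pose at the third moment.
    The first rotation matrix satisfies R_first = R_rel @ R_second,
    so R_rel = R_first @ R_second.T for orthonormal rotations.
    The induced image warp for a pure rotation is H = K @ R_rel @ K^-1.
    """
    R_rel = R_first @ R_second.T
    H = K @ R_rel @ np.linalg.inv(K)
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))
```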
By the embodiment, the electronic equipment can utilize the pose of the electronic equipment when shooting is carried out at the first direction visual angle relative to the target plane and the pose of the electronic equipment when acquiring the second image, and the second image is transformed through three-dimensional coordinate transformation, so that the target object in the second image is displayed at the first direction visual angle relative to the target plane, an ideal target plane view is obtained, and the shooting reliability can be improved.
And in case two, the first information is the first included angle.
In this case, in some embodiments, the transforming the image acquired by the camera according to the first information to obtain the target video includes:
acquiring a third image and a first angle change value, wherein the third image is an image which is acquired by a camera of the electronic equipment at a third moment and comprises the target object; the first angle change value is an angle change value of the electronic equipment at a third moment relative to the first moment, and the third moment is positioned after the first moment;
determining a second included angle according to the first included angle and the first angle change value, wherein the second included angle is an included angle between the electronic equipment and the target plane at the third moment;
determining a second rotation matrix according to the second included angle;
adjusting the target object in the third image according to the second rotation matrix, wherein the adjusted target object in the third image is displayed at a first directional viewing angle relative to the target plane;
wherein the target video comprises the adjusted third image.
The third image may be any image including the target object captured by a camera of the electronic device after the target plane is selected.
The first angle change value may be acquired from data of a gyroscope of the electronic device. In a specific implementation, the angular velocity of each axis at the current moment can be acquired from the gyroscope data, and the angle change of the device within a unit time length can be acquired by integrating the angular velocity. Each set of gyroscope data thus yields the angle change of the device attitude over a time interval. By accumulating the angle changes over the individual time intervals, the angle change over the period from the first moment to the third moment can be obtained. Then, the second included angle can be obtained by superimposing the first angle change value onto the first included angle.
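A minimal sketch of the gyroscope integration described above, assuming Python with NumPy, a single rotation axis of interest, and timestamped angular-velocity samples; the function name and the simple rectangular integration are illustrative assumptions.

```python
import numpy as np

def accumulate_angle(gyro_samples, timestamps, initial_angle):
    """Integrate gyroscope angular velocity (rad/s) from the first moment to
    the third moment and superimpose the result onto the first included angle
    to obtain the second included angle, as described above.

    gyro_samples: (N,) angular velocity about the relevant axis.
    timestamps:   (N,) sample times in seconds, aligned with gyro_samples.
    """
    dt = np.diff(timestamps)
    # Angle change in each interval = angular velocity * interval length.
    angle_change = np.sum(gyro_samples[:-1] * dt)
    return initial_angle + angle_change
```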
After the second included angle is obtained, the three-dimensional matrix corresponding to the second included angle, that is, the second rotation matrix, may be determined. Then, based on the conversion principle between three-dimensional world coordinates and two-dimensional pixel coordinates, the target object in the third image may be transformed using the second rotation matrix, so as to obtain the target image at the first-direction shooting viewing angle of the target plane.
The three-dimensional matrix corresponding to the second included angle may be obtained, but is not limited to being obtained, based on a conversion relationship between the included angle and the three-dimensional matrix, where the conversion relationship may be obtained, but is not limited to being obtained, in the following way: the pose of the electronic device when it captures the third image is acquired and recorded as a third pose; the electronic device calculates a three-dimensional rotation matrix based on the third pose and the first pose; then, the electronic device may determine the conversion relationship between the included angle and the three-dimensional matrix based on the three-dimensional rotation matrix and the first included angle.
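The sketch below shows one possible angle-to-matrix conversion, assuming the included angle is a rotation about a single known axis (the pitch axis by default) and using OpenCV's Rodrigues conversion; this is an assumed simplification for illustration, not the conversion relationship actually derived by the method above.

```python
import numpy as np
import cv2

def rotation_from_angle(angle_rad, axis=(1.0, 0.0, 0.0)):
    """Sketch of the angle-to-matrix conversion: build the three-dimensional
    rotation matrix corresponding to the second included angle, assuming the
    rotation is about a single known axis (the pitch axis by default).
    """
    rvec = np.asarray(axis, dtype=np.float64) * angle_rad  # axis-angle vector
    R, _ = cv2.Rodrigues(rvec)                             # 3x3 rotation matrix
    return R
```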
According to the embodiment, the included angle between the electronic equipment and the target plane at the third moment can be obtained by utilizing the included angle between the electronic equipment and the target plane when the target plane is calibrated and the angle change value between the calibration moment and the third moment of the electronic equipment, and then the included angle is utilized to carry out transformation processing on the third image through three-dimensional coordinate transformation, so that the target object in the third image is displayed at a first direction visual angle relative to the target plane, an ideal target plane view is obtained, and the shooting reliability can be improved.
The following describes the acquisition of the first information in a scene where the electronic device is in the second-direction angle-of-view shooting state of the target plane when the user calibrates the target plane.
In some embodiments, said determining first information in response to said first input may comprise:
acquiring a first coordinate of a target point of the target object in the shooting preview interface;
determining a third rotation matrix according to the first coordinate and a second coordinate, wherein the second coordinate is a coordinate of the target point when the electronic equipment shoots the target plane at a first direction visual angle relative to the target plane;
and determining first information according to an initial pose and the third rotation matrix, wherein the initial pose is the pose of the electronic equipment at the first moment.
In this embodiment, the electronic device may acquire in advance the coordinates of each point of the target object obtained by shooting the target plane when the electronic device is in the first-direction viewing-angle shooting state of the target plane. In this way, after acquiring the coordinates of each point of the target object in the shooting preview interface, the electronic device may determine whether the target object in the shooting preview interface is perspective-distorted compared with the target object shot in the first-direction viewing-angle shooting state.
Then, using a point with perspective distortion, namely the coordinate of the target point after perspective (the first coordinate) and its coordinate after perspective correction (the second coordinate), a three-dimensional rotation matrix, i.e. the third rotation matrix, can be obtained through the conversion between three-dimensional world coordinates and two-dimensional pixel coordinates.
For ease of understanding, assume that the first direction is the positive direction, and continue to refer to fig. 3a and 3b. Let the coordinate of point B in fig. 3a be x1, the coordinate of point B in fig. 3b be x2, and the coordinate of point B in world coordinates be X. The relationship between x1 and X can be expressed as:
x1 = K1 × R(t1) × X; (1)
The relationship between x2 and X can be expressed as:
x2 = K2 × R(t2) × X (2)
Combining the two formulas gives the conversion relationship between x1 and x2:
x2 = K2 × R(t2) × (K1 × R(t1))^(-1) × x1 (3)
wherein Ki represents the camera parameters at time ti; R(ti) represents the three-dimensional rotation matrix at time ti; i is 1 or 2. In the embodiment of the present application, taking the shooting state at the first-direction viewing angle of the target plane as the reference, R(t1) can be considered an identity matrix.
In the present embodiment, x1, x2, K2, K1 and R(t1) are known, so R(t2), i.e. the third rotation matrix described above, can be obtained by substituting them into the above formula.
In case one and case two above, x2, K2, K1, R(t1) and R(t2) are known, so x1 can be obtained based on the above formula, thereby realizing the transformation of the target object and obtaining the target object at the positive-direction shooting viewing angle of the target plane.
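For illustration, the sketch below estimates R(t2) from formula (3) under the stated assumption R(t1) = I, using at least four correspondences between the target-point coordinates at the reference (first-direction) view and in the preview frame; it relies on OpenCV's homography estimation, and the final orthonormalization step is an added assumption to cope with noise and scale, not something specified by the method.

```python
import numpy as np
import cv2

def third_rotation_from_points(pts_preview, pts_reference, K1, K2):
    """Sketch of formula (3): with R(t1) = I, reference-view pixels x1 map to
    preview pixels x2 as x2 ~ K2 @ R(t2) @ inv(K1) @ x1. Estimate that mapping
    as a homography from >= 4 point pairs, then recover R(t2) as
    inv(K2) @ H @ K1, projected onto the nearest true rotation.
    """
    src = np.asarray(pts_reference, dtype=np.float32)
    dst = np.asarray(pts_preview, dtype=np.float32)
    H, _ = cv2.findHomography(src, dst, method=0)
    R_approx = np.linalg.inv(K2) @ H @ K1
    # Project onto the nearest rotation matrix via SVD (removes scale).
    U, _, Vt = np.linalg.svd(R_approx)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])
    return U @ D @ Vt
```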
After the third rotation matrix is obtained, the electronic device may convert the initial pose using the third rotation matrix to obtain the first pose. Further, the angle between the initial pose and the first pose may be determined as the first included angle.
By the embodiment, the three-dimensional conversion matrix at the moment can be obtained by converting the three-dimensional world coordinate and the two-dimensional pixel coordinate through the coordinate of the point with perspective distortion in the target object, and then the first information can be obtained, so that the reliability of image processing can be improved, and the shooting effect is improved.
It should be noted that, in the embodiments of the present application, various optional implementations that are described in the embodiments may be implemented in combination with each other or separately without conflicting with each other, and the embodiments of the present application are not limited to this.
For ease of understanding, the examples are illustrated below:
in this example, the shooting method according to the embodiment of the present application is implemented in combination with the horizon anti-shake technology, and at this time, the shooting method according to the embodiment of the present application may be understood as a plane-locking shooting method based on the horizon anti-shake technology. Of course, it is understood that, in other embodiments, the shooting method and the horizon anti-shake technology in the embodiment of the present application may be implemented independently, which may be determined according to actual requirements, and the embodiment of the present application does not limit this.
In addition, in this example, the first direction is taken as an example for explanation, but the expression form of the first direction is not limited to this, and in other examples, the first direction may be taken as another direction, but the principle of the imaging is the same, and the user can obtain an optimal view of a constant angle of view in the first direction when the user images the imaging plane at different angles and positions.
The photographing method may include the steps of:
the method comprises the following steps: the user can pre-select the target plane to be photographed on the device.
As shown in fig. 3a and 3b, the user may select the four points A, B, C and D to determine the plane. In practice, a plane can be identified, and its positive direction determined, by selecting at least 4 points. The plane area can then be continuously tracked by image recognition.
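A sketch of this four-point selection step, assuming Python with OpenCV and that the corners are supplied in a consistent order; mapping them to an axis-aligned rectangle yields the positive-direction view of the selected region. The output size is a free parameter of this illustration, not something specified by the method.

```python
import numpy as np
import cv2

def rectify_selected_plane(frame, corners_abcd, out_size):
    """Map the four user-selected corners (A, B, C, D, in order) of the target
    plane onto an axis-aligned rectangle, yielding the positive-direction view
    of the selected region. out_size = (width, height) of the rectified view.
    """
    w, h = out_size
    src = np.asarray(corners_abcd, dtype=np.float32)
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    H = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, H, (w, h))
```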
It is worth noting that when the user marks the target area, the user is only required to accurately select the plane area without being required to be exactly positioned at the shooting position in the positive direction.
The angle difference between the device at the current shooting moment and the target plane can be calculated from the difference between the perspective-distortion angle of the current plane and the angle of the positive-direction plane, so as to obtain the device pose corresponding to the positive-direction viewing angle.
The angle calculation involved in the algorithm is illustrated as follows:
Consistent with the principle of the EIS algorithm, the conversion calculation between three-dimensional world coordinates and two-dimensional pixel coordinates can be carried out using the coordinate of point B after perspective and the coordinate of point B after perspective correction, so as to obtain the included angle between the device in the initial state and the positive direction.
Assuming that B in fig. 3a is x1, the rotation matrix at that moment can be calculated from that state by substituting the identity matrix; B in fig. 3b is x2, and the rotation matrix of fig. 3b can be obtained by substituting it into the following formula.
x2 = K2 × R(t2) × (K1 × R(t1))^(-1) × x1
When combined with the horizon anti-shake mode, the obtained rotation matrix only needs to contain the angle change in a single direction, X (or Y), without considering the angles in the Z and Y (or X) directions, namely the included angle between the device and the shooting plane in the initial state.
The target plane can also be calibrated in a simpler and more convenient mode, namely the calibration is carried out at the position shot in the positive direction by the user, and at the moment, the included angle between the equipment and the shooting plane is 0 without additional calculation.
Step two:
After the device pose in the positive direction is obtained, the image can be transformed (a perspective transformation is carried out to generate a corresponding perspective distortion grid, and the image pixels are moved according to the grid to perform perspective correction), which theoretically yields the imaging plane in the positive direction of the device. The plane at this moment is the ideal positive-direction imaging plane, and in subsequent shooting this plane is maintained for imaging, even if the position and shooting angle of the device change.
Based on the horizon anti-shake technology, the imaging plane can be ensured to always stay in the positive direction of the horizon, so rotation about the device's roll axis (ROLL axis) during shooting does not need to be considered. Since shake about the roll axis does not introduce perspective distortion of the shooting plane, in this example, as shown in fig. 4, only the perspective distortion caused by picture shake about the nodding axis (PITCH axis, rotation in the vertical direction) or the panning axis (YAW axis, rotation in the horizontal direction) needs to be considered.
Step three:
The real-time angle between the current device and the target shooting plane can be estimated by acquiring gyroscope data; the specific steps are as follows:
The angular velocity of each axis of the device at the current moment can be obtained from the gyroscope data, and the angle change of the device within a unit time length can be obtained by integrating the angular velocity. Each set of gyroscope data thus yields the angle change of the device attitude over a time interval. By accumulating the angle changes over each time interval, the angle change over the period from the initial moment to the current moment can be obtained.
The real-time angle between the device and the target plane can be obtained by superimposing this angle change onto the angle between the device in the initial state and the target plane (obtained in step one), as shown in fig. 5.
Step four:
Based on this angle, combined with the EIS anti-shake algorithm and through the three-dimensional coordinate conversion described in step one, the degree of perspective distortion of the current image frame relative to the positive-direction imaging frame can be calculated, so as to obtain the amount of distortion to be corrected for the current frame; the image is then transformed to obtain the ideal target plane view, as shown in FIG. 6.
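Tying steps three and four together, the per-frame sketch below reuses the accumulate_angle and rotation_from_angle helpers sketched earlier in this description (both illustrative assumptions of this write-up) and applies the pure-rotation warp H = K × R × K^(-1); it is a simplified stand-in for the EIS-based correction, not the actual algorithm.

```python
import numpy as np
import cv2

def process_frame(frame, K, base_angle_rad, gyro_samples, timestamps):
    """Per-frame sketch combining steps three and four: update the real-time
    included angle to the target plane, build the rotation that undoes the
    tilt, and warp the frame back to the positive-direction view.
    """
    angle = accumulate_angle(gyro_samples, timestamps, base_angle_rad)  # step three
    R = rotation_from_angle(-angle)                                     # undo the tilt
    H = K @ R @ np.linalg.inv(K)                                        # step four
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))
```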
Step five:
This solution requires frame cropping. When the frame deflection angle is too large, the target plane (the area pre-selected by the user) may not be fully presented in the frame. This does not affect the calculation and distortion-correction processing of the solution: the part of the target plane that is still within the frame after processing will be presented in the frame at the positive viewing angle, as shown in fig. 7.
In the above manner, shooting with a locked target plane can be realized. The scheme is not limited to video shooting; it can also be used for shooting previews and other scenarios that need to use continuous frame images to lock the same shooting plane.
In the shooting method provided by the embodiment of the present application, the execution subject may be a shooting device, or a control module in the shooting device for executing the shooting method. The embodiment of the present application takes an example in which a shooting device executes a shooting method, and the shooting device provided in the embodiment of the present application is described.
Referring to fig. 8, fig. 8 is a structural diagram of a camera according to an embodiment of the present disclosure.
As shown in fig. 8, the photographing apparatus 800 includes:
a receiving module 801, configured to receive a first input, where the first input is used to determine a target plane;
a control module 802, configured to, in response to the first input, control a camera of the electronic device to take a picture with the target plane as a reference plane, so as to obtain a target video, where in the target video, a target object is displayed at an angle of view in a first direction relative to the target plane, and the target object is an image of a picture object in the target plane.
In some embodiments, the first input is: input for the target object in a shooting preview interface;
the control module includes:
the determining submodule is used for responding to the first input and determining first information, wherein the first information is a first pose or a first included angle, and the first pose is a pose when the electronic equipment shoots at a first direction visual angle relative to the target plane; a first included angle is an included angle between the electronic equipment and the target plane at a first moment, and the first moment is the input moment of the first input;
and the transformation submodule is used for transforming the image acquired by the camera according to the first information to obtain a target video.
In some embodiments, in the case that the first information is the first pose, the transformation submodule includes:
the first acquisition unit is used for acquiring a second image and a second pose, wherein the second image is an image which is acquired by a camera of the electronic equipment at a third moment and comprises the target object, the second pose is a pose of the electronic equipment at the third moment, and the third moment is behind the first moment;
a first determination unit, configured to determine a first rotation matrix according to the first pose and the second pose;
a first adjusting unit, configured to adjust the target object in the second image according to the first rotation matrix, where the adjusted target object in the second image is displayed at a first directional viewing angle relative to the target plane;
wherein the target video comprises the adjusted second image.
In some embodiments, in the case that the first information is the first included angle, the transformation submodule includes:
the second obtaining unit is used for obtaining a third image and a first angle change value, wherein the third image is an image which is acquired by a camera of the electronic equipment at a third moment and comprises the target object; the first angle change value is an angle change value of the electronic equipment at a third moment relative to the first moment, and the third moment is positioned after the first moment;
a second determining unit, configured to determine a second included angle according to the first included angle and the first angle variation value, where the second included angle is an included angle between the electronic device and the target plane at the third time;
a third determining unit, configured to determine a second rotation matrix according to the second included angle;
a second adjusting unit, configured to adjust the target object in the third image according to the second rotation matrix, where the adjusted target object in the third image is displayed at a first directional viewing angle relative to the target plane;
wherein the target video comprises the adjusted third image.
In some embodiments, the determining sub-module comprises:
a third obtaining unit, configured to obtain a first coordinate of a target point of the target object in the shooting preview interface;
a fourth determining unit, configured to determine a third rotation matrix according to the first coordinate and a second coordinate, where the second coordinate is a coordinate of the target point when the electronic device photographs the target plane at a first directional angle of view relative to the target plane;
a fifth determining unit, configured to determine first information according to an initial pose and the third rotation matrix, where the initial pose is the pose of the electronic device at the first time.
The shooting device in the embodiment of the present application may be a device, and may also be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiment of the present application is not particularly limited.
The photographing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The shooting device provided in the embodiment of the present application can implement each process in the method embodiment of fig. 2, and is not described here again to avoid repetition.
Optionally, as shown in fig. 9, an electronic device 900 is further provided in this embodiment of the present application, and includes a processor 901, a memory 902, and a program or an instruction stored in the memory 902 and executable on the processor 901, where the program or the instruction is executed by the processor 901 to implement each process of the foregoing shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 10 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may further comprise a power supply (e.g., a battery) for supplying power to various components, and the power supply may be logically connected to the processor 1010 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 10 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, or combine some components, or arrange different components, which is not repeated here.
The input unit 1004 is configured to receive a first input, where the first input is used to determine a target plane;
a processor 1010, configured to, in response to the first input, control a camera of the electronic device to take a picture with the target plane as a reference plane, so as to obtain a target video, where in the target video, a target object is displayed at an angle of view in a first direction relative to the target plane, and the target object is an image of a picture taken in the target plane.
The shooting device provided in the embodiment of the present application can implement each process in the method embodiment of fig. 2, and is not described here again to avoid repetition.
It should be understood that in the embodiment of the present application, the input Unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042, and the Graphics Processing Unit 10041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071 is also referred to as a touch screen. The touch panel 10071 may include two parts, a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 1009 may be used to store software programs as well as various data, including but not limited to application programs and operating systems. Processor 1010 may integrate an application processor that handles primarily operating systems, user interfaces, applications, etc. and a modem processor that handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above shooting method embodiment, and can achieve the same technical effect, and the details are not repeated in order to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A photographing method, characterized by comprising:
receiving a first input, the first input being used to determine a target plane;
and in response to the first input, controlling a camera of the electronic equipment to shoot by taking the target plane as a reference plane to obtain a target video, wherein in the target video, a target object is displayed at a first direction visual angle relative to the target plane, and the target object is an image of a shot object in the target plane.
2. The method of claim 1, wherein the first input is: input for the target object in a shooting preview interface;
the responding to the first input, controlling a camera of the electronic device to shoot by taking the target plane as a reference plane to obtain a target video, including:
determining first information in response to the first input, wherein the first information is a first pose or a first included angle, and the first pose is a pose of the electronic equipment when the electronic equipment shoots at a first direction visual angle relative to the target plane; a first included angle is an included angle between the electronic equipment and the target plane at a first moment, and the first moment is the input moment of the first input;
and transforming the image collected by the camera according to the first information to obtain a target video.
3. The method according to claim 2, wherein, in the case that the first information is the first pose, the transforming the image acquired by the camera according to the first information to obtain a target video comprises:
acquiring a second image and a second pose, wherein the second image is an image which is acquired by a camera of the electronic device at a third moment and comprises the target object, the second pose is the pose of the electronic device at the third moment, and the third moment is after the first moment;
determining a first rotation matrix according to the first pose and the second pose;
adjusting the target object in the second image according to the first rotation matrix, wherein the adjusted target object in the second image is displayed at a first directional viewing angle relative to the target plane;
wherein the target video comprises the adjusted second image.
4. The method according to claim 2, wherein, in the case that the first information is the first included angle, the transforming the image acquired by the camera according to the first information to obtain a target video comprises:
acquiring a third image and a first angle change value, wherein the third image is an image which is acquired by a camera of the electronic device at a third moment and comprises the target object, the first angle change value is an angle change value of the electronic device at the third moment relative to the first moment, and the third moment is after the first moment;
determining a second included angle according to the first included angle and the first angle change value, wherein the second included angle is an included angle between the electronic device and the target plane at the third moment;
determining a second rotation matrix according to the second included angle;
adjusting the target object in the third image according to the second rotation matrix, wherein the adjusted target object in the third image is displayed at a first directional viewing angle relative to the target plane;
wherein the target video comprises the adjusted third image.
5. The method of claim 2, wherein determining first information in response to the first input comprises:
acquiring a first coordinate of a target point of the target object in the shooting preview interface;
determining a third rotation matrix according to the first coordinate and a second coordinate, wherein the second coordinate is a coordinate of the target point when the electronic device shoots the target plane at the first directional viewing angle relative to the target plane;
and determining first information according to an initial pose and the third rotation matrix, wherein the initial pose is the pose of the electronic device at the first moment.
6. A shooting apparatus, applied to an electronic device, characterized by comprising:
a receiving module, configured to receive a first input, wherein the first input is used to determine a target plane;
and a control module, configured to control, in response to the first input, a camera of the electronic device to shoot by taking the target plane as a reference plane to obtain a target video, wherein, in the target video, a target object is displayed at a first directional viewing angle relative to the target plane, and the target object is an image of a shot object in the target plane.
7. The apparatus of claim 6, wherein the first input is an input for the target object in a shooting preview interface;
the control module comprises:
a determining submodule, configured to determine first information in response to the first input, wherein the first information is a first pose or a first included angle, the first pose is a pose of the electronic device when the electronic device shoots at the first directional viewing angle relative to the target plane, the first included angle is an included angle between the electronic device and the target plane at a first moment, and the first moment is the moment at which the first input is received;
and a transformation submodule, configured to transform the image acquired by the camera according to the first information to obtain a target video.
8. The apparatus of claim 7, wherein in the case that the first information is the first pose, the transformation submodule comprises:
a first acquisition unit, configured to acquire a second image and a second pose, wherein the second image is an image which is acquired by a camera of the electronic device at a third moment and comprises the target object, the second pose is a pose of the electronic device at the third moment, and the third moment is after the first moment;
a first determination unit, configured to determine a first rotation matrix according to the first pose and the second pose;
a first adjusting unit, configured to adjust the target object in the second image according to the first rotation matrix, where the adjusted target object in the second image is displayed at a first directional viewing angle relative to the target plane;
wherein the target video comprises the adjusted second image.
9. The apparatus of claim 7, wherein, in the case that the first information is the first included angle, the transformation submodule comprises:
a second obtaining unit, configured to obtain a third image and a first angle change value, wherein the third image is an image which is acquired by a camera of the electronic device at a third moment and comprises the target object, the first angle change value is an angle change value of the electronic device at the third moment relative to the first moment, and the third moment is after the first moment;
a second determining unit, configured to determine a second included angle according to the first included angle and the first angle variation value, where the second included angle is an included angle between the electronic device and the target plane at the third time;
a third determining unit, configured to determine a second rotation matrix according to the second included angle;
a second adjusting unit, configured to adjust the target object in the third image according to the second rotation matrix, where the adjusted target object in the third image is displayed at a first directional viewing angle relative to the target plane;
wherein the target video comprises the adjusted third image.
10. The apparatus of claim 7, wherein the determining submodule comprises:
a third obtaining unit, configured to obtain a first coordinate of a target point of the target object in the shooting preview interface;
a fourth determining unit, configured to determine a third rotation matrix according to the first coordinate and a second coordinate, wherein the second coordinate is a coordinate of the target point when the electronic device shoots the target plane at the first directional viewing angle relative to the target plane;
a fifth determining unit, configured to determine first information according to an initial pose and the third rotation matrix, wherein the initial pose is the pose of the electronic device at the first moment.
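The transformation recited in claims 2 to 5 can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the patented implementation: the pure-rotation homography model, the camera intrinsic matrix K, the choice of the x-axis for the included angle, and every function and variable name below are assumptions introduced here only for clarity.

# Minimal sketch (Python, NumPy + OpenCV) of the image adjustment in claims 3 and 4.
# All names and the pure-rotation assumption are illustrative, not taken from the patent text.
import numpy as np
import cv2

def rotation_from_poses(R_first, R_current):
    # "First rotation matrix" (claim 3): relative rotation taking the pose at the
    # third moment back to the pose held at the first moment.
    return R_first @ R_current.T

def rotation_from_angle(first_angle_rad, angle_change_rad):
    # "Second rotation matrix" (claim 4): built from the second included angle
    # (first included angle plus the measured angle change), here assumed to be
    # a tilt about the camera x-axis.
    second_angle = first_angle_rad + angle_change_rad
    c, s = np.cos(second_angle), np.sin(second_angle)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def warp_to_reference_view(frame, K, R):
    # Adjusting the target object in the frame: for a (near) pure rotation the
    # induced image warp is the homography H = K R K^-1.
    H = K @ R @ np.linalg.inv(K)
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))

if __name__ == "__main__":
    # Hypothetical intrinsics, poses, and frame, purely for demonstration.
    K = np.array([[1000.0, 0.0, 640.0],
                  [0.0, 1000.0, 360.0],
                  [0.0, 0.0, 1.0]])
    R_first = np.eye(3)                                       # pose at the first moment
    R_current, _ = cv2.Rodrigues(np.array([0.05, 0.0, 0.0]))  # pose at the third moment
    frame = np.zeros((720, 1280, 3), dtype=np.uint8)
    stabilized = warp_to_reference_view(frame, K, rotation_from_poses(R_first, R_current))

Each frame of the target video would then be the warped frame; in a real device the pose or angle change at the third moment would come from gyroscope/IMU data or visual tracking, sources the claims do not restrict.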
CN202210684181.9A 2022-06-17 2022-06-17 Shooting method and device thereof Pending CN115278049A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210684181.9A CN115278049A (en) 2022-06-17 2022-06-17 Shooting method and device thereof
PCT/CN2023/099597 WO2023241495A1 (en) 2022-06-17 2023-06-12 Photographic method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210684181.9A CN115278049A (en) 2022-06-17 2022-06-17 Shooting method and device thereof

Publications (1)

Publication Number Publication Date
CN115278049A true CN115278049A (en) 2022-11-01

Family

ID=83761167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210684181.9A Pending CN115278049A (en) 2022-06-17 2022-06-17 Shooting method and device thereof

Country Status (2)

Country Link
CN (1) CN115278049A (en)
WO (1) WO2023241495A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023241495A1 (en) * 2022-06-17 2023-12-21 维沃移动通信有限公司 Photographic method and apparatus

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113873148A (en) * 2021-09-14 2021-12-31 维沃移动通信(杭州)有限公司 Video recording method, video recording device, electronic equipment and readable storage medium
CN114007054A (en) * 2022-01-04 2022-02-01 宁波均联智行科技股份有限公司 Method and device for correcting projection of vehicle-mounted screen picture

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3866600B2 (en) * 2002-03-27 2007-01-10 株式会社東芝 Image processing apparatus and image processing method
CN107809594B (en) * 2017-11-10 2019-09-27 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN111047536B (en) * 2019-12-18 2023-11-14 深圳市汉森软件股份有限公司 CCD image correction method, device, equipment and storage medium
CN112927306B (en) * 2021-02-24 2024-01-16 深圳市优必选科技股份有限公司 Calibration method and device of shooting device and terminal equipment
CN115278049A (en) * 2022-06-17 2022-11-01 维沃移动通信有限公司 Shooting method and device thereof

Also Published As

Publication number Publication date
WO2023241495A1 (en) 2023-12-21

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination