CN108200335B - Photographing method based on double cameras, terminal and computer readable storage medium - Google Patents


Info

Publication number: CN108200335B
Application number: CN201711468892.8A
Authority: CN (China)
Prior art keywords: shooting, target object, value, camera, target
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN108200335A
Inventor: 陈正言
Current assignee: Shenzhen Microphone Holdings Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original assignee: Shenzhen Jinli Communication Equipment Co Ltd
Application filed by: Shenzhen Jinli Communication Equipment Co Ltd
Priority application: CN201711468892.8A
Publication of application: CN108200335A
Application granted; publication of granted patent: CN108200335B

Classifications

    • H: ELECTRICITY
    • H04N: Pictorial communication, e.g. television
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/67: Focus control based on electronic image sensor signals
    • H04M: Telephonic communication
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448: User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M2250/52: Details of telephonic subscriber devices including functional features of a camera

Abstract

The embodiment of the invention discloses a photographing method based on double cameras, a terminal and a computer readable storage medium, wherein the method comprises the following steps: acquiring a preview image through a first camera and a second camera; acquiring a first distance value between the first camera and the second camera in the vertical direction, a first visual angle value corresponding to the first camera, and a second visual angle value corresponding to the second camera; determining a target shooting distance value between a target object and the terminal according to the first distance value, the first visual angle value and the second visual angle value; acquiring a shooting parameter corresponding to the target shooting distance value according to a preset corresponding relation between shooting distance values and shooting parameters; and if a shooting instruction is detected, shooting a picture according to the shooting parameter. According to the embodiment of the invention, the focal length of the camera is accurately adjusted through the target shooting distance value, so that a clear image of the target object is obtained and the imaging quality of the picture is improved.

Description

Photographing method based on double cameras, terminal and computer readable storage medium
Technical Field
The invention relates to the technical field of electronics, in particular to a photographing method based on double cameras, a terminal and a computer readable storage medium.
Background
With the increasing popularity of mobile phones, tablet computers and other intelligent mobile terminals, the photographing function of the mobile terminal is widely used. People record daily life with it and increasingly pursue photo quality. Many factors affect the quality of a photo, such as the sensor resolution (pixels), the lens material, the zoom system, and the flash; with the camera hardware fixed, the adjustment of exposure and focal length is the key factor affecting photo quality.
However, non-professional users usually shoot with automatic focusing, and the focal length cannot be accurately adjusted for different shooting distances, so the resulting photos are not sharp enough and the image quality is reduced.
Disclosure of Invention
The embodiment of the invention provides a photographing method based on double cameras, a terminal and a computer readable storage medium, which can be used for photographing to obtain a clear image of a target object and improving the imaging quality of a photo.
In a first aspect, an embodiment of the present invention provides a photographing method based on two cameras, where the method includes:
acquiring a preview image through a first camera and a second camera; wherein the preview image contains a target object to be photographed; the number of the target objects is at least two;
acquiring a first distance value between the first camera and the second camera in the vertical direction, acquiring a first visual angle value corresponding to the first camera and acquiring a second visual angle value corresponding to the second camera;
determining a target shooting distance value between the target object and the terminal according to the first distance value, the first visual angle value and the second visual angle value;
obtaining an expected height difference between the first target object and the second target object;
determining and prompting a moving direction and a moving distance value corresponding to the first target object or the second target object according to a first image height value corresponding to the first target object, a second image height value corresponding to the second target object and the expected height difference value;
acquiring a shooting parameter corresponding to the target shooting distance value according to a preset corresponding relation between shooting distance values and shooting parameters; wherein the shooting parameters at least comprise a focal length;
and if the shooting instruction is detected, shooting a picture according to the shooting parameters.
In a second aspect, an embodiment of the present invention provides a terminal, where the terminal includes a unit configured to perform the method of the first aspect.
In a third aspect, an embodiment of the present invention provides another terminal, which includes a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory is used to store a computer program that supports the terminal to execute the foregoing method, and the computer program includes program instructions, and the processor is configured to call the program instructions to execute the foregoing method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, in which a computer program is stored, the computer program comprising program instructions, which, when executed by a processor, cause the processor to perform the method of the first aspect.
According to the embodiment of the invention, the target shooting distance value between the target object and the terminal is determined through the visual angle values corresponding to the first camera and the second camera and the distance value between the first camera and the second camera, the shooting parameter corresponding to the target shooting distance value is determined, the shooting parameter comprises the focal length, and the shooting is carried out according to the determined shooting parameter. The terminal can accurately adjust the focal length of the camera according to the target shooting distance value, clear images of the target object are obtained through shooting, and the imaging quality of the pictures is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present invention; other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flow chart of a photographing method based on two cameras according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a shooting scene provided by an embodiment of the invention;
fig. 3 is a schematic flowchart of a photographing method based on dual cameras according to another embodiment of the present invention;
FIG. 4 is a diagram of a shooting scene according to another embodiment of the present invention;
fig. 5 is a schematic block diagram of a terminal according to an embodiment of the present invention;
fig. 6 is a schematic block diagram of a terminal according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [described condition or event]" or "in response to detecting [described condition or event]".
In particular implementations, the terminals described in embodiments of the invention include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having touch-sensitive surfaces (e.g., touch-screen displays and/or touch pads). It should also be understood that in some embodiments the device is not a portable communication device but a desktop computer having a touch-sensitive surface (e.g., a touch-screen display and/or touchpad).
In the discussion that follows, a terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
Referring to fig. 1, a schematic flowchart of a photographing method based on two cameras according to an embodiment of the present invention is provided. The main executing body of the photographing method of this embodiment is a terminal with two rear cameras, and the terminal may include, but is not limited to, a mobile terminal such as a smart phone, a tablet computer, or a personal digital assistant (PDA). The photographing method as shown in the figure may include:
S101: acquiring a preview image through a first camera and a second camera; wherein the preview image contains a target object to be photographed.
When the terminal detects that a user clicks a photographing application or calls a photographing function through other application programs, the terminal starts a first camera and a second camera and acquires a preview image containing a target object to be photographed through the first camera and the second camera. The first camera and the second camera are rear cameras.
The first camera and the second camera may be analog cameras or digital cameras, the cameras include image sensors, and the image sensors may be charge-coupled devices (CCDs) or Complementary Metal Oxide Semiconductors (CMOSs), which is not limited herein.
The size of the image sensor in the camera may include, but is not limited to, 1/3 inch or 1/4 inch, such as a 1/3-inch CCD or a 1/4-inch CCD. The size of an image sensor is usually described by its diagonal length; for example, "1/3 CCD" indicates that the diagonal length of the CCD is 1/3 inch. A CCD with a diagonal length of 1/3 inch is larger than a 1/4-inch CCD. The larger the CCD, the shallower the depth of field at the same focal length and aperture, that is, the better the "blurring effect" will be.
The target object may be a human being, or may be an animal, a plant, or an inanimate object (e.g., a building, etc.). When the target object is a person or an animal, the target object may be stationary or may be dynamically moving, which is not limited herein. The number of the target objects may be one or at least two, and is not limited herein.
The target object to be shot in the preview image can be identified by the user through manual focusing or can be identified automatically by the terminal. Specifically, the terminal may identify an object with the highest degree of focus or located in the middle of the image as a target object through an image recognition technology, and may further prompt the user with the target object identified by the terminal, so that the user may determine whether the object is correct.
S102: the method comprises the steps of obtaining a first distance value of the first camera and the second camera in the vertical direction, obtaining a first visual angle value corresponding to the first camera and obtaining a second visual angle value corresponding to the second camera.
The first distance value between the first camera and the second camera in the vertical direction may be pre-stored in the terminal, or may be input by a user, which is not limited herein.
The terminal can respectively acquire a first visible angle value corresponding to the lens focal length of the first camera and a second visible angle value corresponding to the lens focal length of the second camera according to a preset correspondence between lens focal lengths and visible angle values. The lens focal length of a camera corresponds one-to-one to its visible angle, so once the terminal obtains the lens focal lengths of the two cameras, it can obtain the first visible angle corresponding to the first camera's focal length and the second visible angle corresponding to the second camera's focal length.
Specifically, the terminal acquires the configuration parameter information corresponding to the built-in first camera and second camera, and respectively acquires the first visible angle value corresponding to the lens focal length of the first camera and the second visible angle value corresponding to the lens focal length of the second camera according to a preset correspondence among the lens focal length, the camera configuration parameter information and the visible angle value.
The configuration parameter information of a camera comprises the lens focal length and the parameter information of the image sensor, and the parameter information of the image sensor comprises the type of the image sensor and the length of its diagonal. The terminal stores in advance the preset correspondence among the lens focal length, the camera configuration parameter information and the visible angle value. The lens focal length is the distance from the rear principal point of the lens to the focal point, and is an important performance index of a lens. For the same subject and shooting distance, the larger the lens focal length, the larger the image and the smaller the visible angle; the smaller the lens focal length, the smaller the image and the larger the visible angle.
For example, when the image sensor of the first camera built into the terminal is a CCD, the terminal can obtain the first visible angle value corresponding to the lens focal length of the first camera by looking up Table 1 according to that focal length and the diagonal length of the CCD.
Table 1: correspondence between lens focal length, image-sensor diagonal length and visible angle value (reproduced as an image in the original document and not transcribed here).
For example, when the lens focal length acquired by the terminal for the first camera is 4 mm, the image sensor of the first camera is a CCD, and the diagonal length of the CCD is 1/3 inch, the terminal can look up the table to obtain a first visible angle value of 89.9° corresponding to the 4 mm lens focal length.
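As a rough illustration of such a lookup, the sketch below computes a diagonal angle of view from a sensor diagonal and a lens focal length using the standard pinhole relation. The function name and the 6 mm diagonal assumed for a 1/3-inch CCD are illustrative assumptions, not values from the patent's Table 1 (whose exact entries are not reproduced); note that the ideal pinhole model gives about 73.7° for a 4 mm lens on a 6 mm diagonal, so the patent's tabulated 89.9° evidently follows a different convention.

```python
import math

def view_angle(diagonal_mm: float, focal_length_mm: float) -> float:
    """Diagonal angle of view (degrees) for a given sensor diagonal and lens
    focal length, from the ideal pinhole relation 2 * atan(d / (2 * f))."""
    return math.degrees(2 * math.atan(diagonal_mm / (2 * focal_length_mm)))

# A 1/3-inch CCD has a diagonal of roughly 6 mm (assumed here).
angle_4mm = view_angle(6.0, 4.0)    # ~73.7 degrees under the pinhole model
angle_2p8 = view_angle(6.0, 2.8)    # a shorter focal length gives a wider angle
```

A real terminal would replace the formula with its stored per-sensor table, but the monotonic trend (shorter focal length, wider visible angle) matches the text above.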
S103: and determining a target shooting distance value between the target object and the terminal according to the first distance value, the first visual angle value and the second visual angle value.
Referring to fig. 2, fig. 2 is a schematic view of a shooting scene according to an embodiment of the invention. The shooting scene shown in fig. 2 includes the first camera 11 and the second camera 12 of the terminal and a target object 20, where the visible angle of the first camera 11 is θ1, the visible angle of the second camera 12 is θ2, the intersection point of the target object 20 and the ground is point a, and the distance between the first camera 11 and the second camera 12 is D.
In the triangle formed by the first camera 11, the second camera 12 and the point a, the sum of the interior angles equals 180°: (90° + θ2/2) + β + γ = 180°, and β = 90° - θ1/2; therefore, γ = 180° - (90° - θ1/2) - (90° + θ2/2) = θ1/2 - θ2/2.
With the position of the second camera 12 as the vertex, a perpendicular is drawn to the first line segment formed by the first camera 11 and the point a.
Assuming that the length of the perpendicular is k, then k = D × sin β = D × cos(θ1/2).
Assuming that the length of the first line segment is L1, the law of sines in the triangle gives L1 = D × cos(θ2/2) / sin γ.
Since the target shooting distance between the target object and the terminal is S = L1 × sin β, it follows that S = D × cos(θ1/2) × cos(θ2/2) / sin(θ1/2 - θ2/2).
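The derivation above can be sketched numerically. This is a minimal illustration under the assumptions of Fig. 2 (vertical baseline D between the cameras, ground point a seen at the lower edge of both fields of view); the function name and the sample values are invented for the example.

```python
import math

def target_distance(D: float, theta1_deg: float, theta2_deg: float) -> float:
    """Horizontal distance S from the terminal to point a (Fig. 2), given the
    vertical baseline D between the cameras and their visible angles.
    Uses beta = 90 - theta1/2 and gamma = theta1/2 - theta2/2 as derived above."""
    t1 = math.radians(theta1_deg / 2.0)
    t2 = math.radians(theta2_deg / 2.0)
    gamma = t1 - t2                  # angle subtended at point a by the baseline
    k = D * math.cos(t1)             # perpendicular from camera 2 onto segment L1
    L2 = k / math.sin(gamma)         # segment from camera 2 to point a
    return L2 * math.cos(t2)         # S = L2 * sin(alpha), alpha = 90 - theta2/2

# Illustrative values: 10 mm baseline, visible angles of 90 and 80 degrees.
S = target_distance(0.01, 90.0, 80.0)
```

The result agrees with the closed form S = D × cos(θ1/2) × cos(θ2/2) / sin(θ1/2 - θ2/2); note the method needs θ1 ≠ θ2, otherwise γ = 0 and the triangle degenerates.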
It can be understood that, after calculating the length k of the perpendicular to the first line segment (formed by the first camera 11 and the point a), the terminal can also calculate the included angle α and the segment L2 between the second camera 12 and the point a, where L2 = k / sin γ, and then calculate the target shooting distance S between the target object and the terminal from the included angle α and the segment L2 as S = L2 × sin α.
in other embodiments, the terminal may also obtain a first shooting height value corresponding to the first camera, and calculate a target shooting distance value between the target object and the terminal according to the first shooting height value and the first visual angle value of the first camera; or acquiring a second shooting height value corresponding to the second camera, and calculating a target shooting distance value between the target object and the terminal according to the second shooting height value and a second visual angle value of the second camera.
Wherein, the first shooting height corresponding to the first camera is d1 = L1 × cos β, and the second shooting height corresponding to the second camera is h = L2 × cos α; h may also be calculated from h = d1 - D, where D is the distance between the first camera and the second camera in the vertical direction. Substituting the known values of L1, β, L2 and α into the above expressions yields d1 and h.
The following description will take an example in which the terminal calculates a target photographing distance value between the target object and the terminal based on the second photographing height value and the second viewing angle value of the second camera.
Assuming that the second shooting height is h and the distance between the terminal and the target object 20 in the horizontal direction is S, after the terminal acquires the current shooting height value h and the second visible angle value θ2 corresponding to the second camera, the shooting distance value S is calculated according to the formula S = h × tan(90° - θ2/2).
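A minimal sketch of this height-based variant, directly applying the formula above; the function name and the sample height and angle are illustrative assumptions.

```python
import math

def distance_from_height(h: float, theta2_deg: float) -> float:
    """S = h * tan(90 deg - theta2/2): horizontal distance to point a when the
    lower edge of the second camera's field of view meets the support surface."""
    return h * math.tan(math.radians(90.0 - theta2_deg / 2.0))

# Illustrative values: terminal held 1.5 m above the ground, 80-degree visible angle.
S = distance_from_height(1.5, 80.0)
```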
It can be understood that the method for calculating the target shooting distance value between the target object and the terminal by the terminal according to the first shooting height value and the first visual angle value of the first camera is similar to the method for calculating the target shooting distance value between the target object and the terminal according to the second shooting height value and the second visual angle value of the second camera, and is not repeated here.
In other embodiments, the terminal may further obtain a shooting height value input by the user, or the terminal may determine the shooting height value through data detected by a sensor built in the terminal. The shooting height value is a distance value between the terminal and the ground when the camera collects the preview image, and the distance value is a height value of the terminal from the ground in the gravity direction.
The terminal can determine a shooting height value through a distance value detected by a sensor capable of measuring distance, such as an infrared sensor and an ultrasonic sensor; the terminal can also calculate the shooting height value through the movement speed in the gravity direction detected by a sensor capable of measuring the movement speed, such as a gravity sensor, an acceleration sensor or a gyroscope sensor, and the time required for moving from the ground to the shooting height.
S104: acquiring shooting parameters corresponding to the target shooting distance value according to a preset corresponding relation between shooting distance values and shooting parameters, wherein the shooting parameters at least comprise a focal length.
The preset corresponding relation between the shooting parameters and the shooting distance values is stored in the terminal in advance, and when the terminal obtains the target shooting distance value S between the target object and the terminal, the shooting parameters corresponding to the target shooting distance can be determined according to the preset corresponding relation between the shooting distance values and the shooting parameters.
The shooting parameters may include, but are not limited to, a focal length, and may further include an aperture value, a shutter speed, sensitivity, an exposure amount, and the like.
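The patent does not spell out how the stored correspondence is organized, so the sketch below assumes a simple banded lookup table; all band boundaries, parameter names and values are hypothetical placeholders, not values from the patent.

```python
import bisect

# Hypothetical banded lookup: each entry of DISTANCE_BANDS is the upper bound
# (in metres) of a distance band; PARAMS[i] holds the preset parameters for
# band i. All boundaries and parameter values below are placeholders.
DISTANCE_BANDS = [0.5, 2.0, 10.0, float("inf")]
PARAMS = [
    {"focal_length_mm": 2.8, "aperture": 2.0, "iso": 100},
    {"focal_length_mm": 4.0, "aperture": 2.2, "iso": 100},
    {"focal_length_mm": 6.0, "aperture": 2.8, "iso": 200},
    {"focal_length_mm": 8.0, "aperture": 4.0, "iso": 400},
]

def shooting_params(distance_m: float) -> dict:
    """Return the preset shooting parameters for the band containing distance_m."""
    return PARAMS[bisect.bisect_left(DISTANCE_BANDS, distance_m)]

params = shooting_params(1.2)   # falls in the (0.5, 2.0] band
```

A real terminal might instead interpolate the focal length continuously between bands; a banded table is simply the most direct reading of "preset corresponding relation".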
S105: and if the shooting instruction is detected, shooting a picture according to the shooting parameters.
And when the terminal detects a shooting instruction, shooting an image captured at the current visual angle according to shooting parameters corresponding to the target shooting distance, and forming a picture.
The shooting instruction may be triggered by the user through a shooting key, or may be triggered automatically by the terminal according to a timing shooting time, which is not limited herein.
According to the scheme, the terminal determines a target shooting distance value between a target object and the terminal through the visual angle values corresponding to the first camera and the second camera and the distance value between the first camera and the second camera, determines shooting parameters corresponding to the target shooting distance value, wherein the shooting parameters comprise focal lengths, and shoots according to the determined shooting parameters. The terminal can accurately adjust the focal length of the camera according to the target shooting distance value, clear images of the target object are obtained through shooting, and the imaging quality of the pictures is improved.
Referring to fig. 3, a schematic flowchart of a photographing method based on two cameras according to another embodiment of the present invention is provided. The main executing body of the photographing method of this embodiment is a terminal, and the terminal may include, but is not limited to, a mobile terminal such as a smart phone, a tablet computer, or a personal digital assistant (PDA). The photographing method as shown in the figure may include:
S201: acquiring a preview image through a first camera and a second camera; wherein the preview image contains a target object to be photographed.
S201 in this embodiment is the same as S101 in the previous embodiment, and please refer to the related description of S101 in the previous embodiment, which is not repeated herein.
S202: the method comprises the steps of obtaining a first distance value of the first camera and the second camera in the vertical direction, obtaining a first visual angle value corresponding to the first camera and obtaining a second visual angle value corresponding to the second camera.
S202 in this embodiment is the same as S102 in the previous embodiment, and please refer to the related description of S102 in the previous embodiment, which is not repeated herein.
S203: calculating the length of a first line segment and a first included angle value according to the first distance value, the first visual angle value and the second visual angle value; a first end point of the first line segment is a first position of the first camera or a second position of the second camera, and a second end point of the first line segment is an intersection point of the target object and the supporting surface; if the first end point of the first line segment is the first position of the first camera, the first included angle is an included angle formed by the first line segment and the first camera in the gravity direction; and if the first end point of the first line segment is the second position of the second camera, the first included angle is an included angle formed by the first line segment and the second camera in the gravity direction.
The support surface refers to a surface of a support that currently supports a target object, and the surface is in contact with the target object. The surface of the support currently supporting the target object may be a surface of a ground or a floor of a building. For example, when the target object is standing on the ground, the support surface is the ground level; when the target object stands in or on a building, the support surface is a plane corresponding to the floor surface on which the target object is currently located. The support surface is a surface within the vehicle for carrying or supporting the target object when the target object is within the vehicle. The intersection of the target object with the support surface refers to the support point of the target object with respect to the support surface in contact with the target object.
Referring to fig. 2, fig. 2 is a schematic view of a shooting scene according to an embodiment of the invention. The shooting scene shown in fig. 2 includes the first camera 11 and the second camera 12 of the terminal and a target object 20, where the visible angle of the first camera 11 is θ1, the visible angle of the second camera 12 is θ2, the intersection point of the target object 20 and the ground (support surface) is point a, and the distance between the first camera 11 and the second camera 12 is D. When the first end point of the first line segment is the first position of the first camera 11, the first line segment is the segment L1 formed by the first camera 11 and the point a, and the first included angle is β; when the first end point of the first line segment is the second position of the second camera 12, the first line segment is the segment L2 formed by the second camera 12 and the point a, and the first included angle is α.
Wherein β = 90° - θ1/2 and α = 90° - θ2/2.
In the triangle formed by the first camera 11, the second camera 12 and the point a, the sum of the interior angles equals 180°: (90° + θ2/2) + β + γ = 180°, and β = 90° - θ1/2; therefore, γ = 180° - (90° - θ1/2) - (90° + θ2/2) = θ1/2 - θ2/2.
With the position of the second camera 12 as the vertex, a perpendicular is dropped onto the first line segment connecting the first camera 11 and the point a.
Assuming that the length of the perpendicular is k, then k = D × sin β.
Assume that the length of the first line segment is L1. By the law of sines in the triangle, L1 = D × sin(90° + θ2/2) / sin γ = D × cos(θ2/2) / sin γ.
It will be appreciated that after the terminal calculates the length of the first line segment L1 (connecting the first camera 11 and the point a) and the length k of the perpendicular, it can also calculate the included angle α and the line segment L2 between the second camera 12 and the point a, and then calculate the target shooting distance S between the target object and the terminal from α and L2. Here α = 90° − θ2/2, L2 = k / sin γ = D × sin β / sin γ, and S = L2 × sin α.
S204: calculating a target shooting distance value between the target object and the terminal according to the length of the first line segment and the first included angle value.
Wherein the target shooting distance between the target object and the terminal is S = L1 × sin β, and the length of the first line segment is L1 = D × cos(θ2/2) / sin γ, so the target shooting distance can be calculated as S = D × cos(θ2/2) × sin β / sin γ.
Alternatively, using S = L2 × sin α and the length of the line segment L2 = D × sin β / sin γ, the target shooting distance can be calculated as S = D × sin β × sin α / sin γ. The two expressions agree, since cos(θ2/2) = sin α.
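The triangulation of S203–S204 can be sketched in code. The following is an illustrative Python reconstruction of the formulas above (the function name and sample values are my own); it assumes θ1 > θ2 so that γ > 0:

```python
import math

def target_shooting_distance(D, theta1_deg, theta2_deg):
    """Horizontal distance S from the terminal to the target object,
    from the vertical camera spacing D and the two visible angles
    (Fig. 2 geometry; requires theta1 > theta2 so that gamma > 0)."""
    beta = math.radians(90 - theta1_deg / 2)    # angle of L1 with the gravity direction
    alpha = math.radians(90 - theta2_deg / 2)   # angle of L2 with the gravity direction
    gamma = math.radians((theta1_deg - theta2_deg) / 2)  # angle at point a
    L1 = D * math.cos(math.radians(theta2_deg / 2)) / math.sin(gamma)
    L2 = D * math.sin(beta) / math.sin(gamma)
    # Both routes yield the same S, since cos(theta2/2) = sin(alpha).
    return L1 * math.sin(beta), L2 * math.sin(alpha)

s1, s2 = target_shooting_distance(D=0.01, theta1_deg=80, theta2_deg=60)
```

Returning S by both routes makes the consistency of the two derivations easy to check numerically.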
Optionally, when the target objects are people and there are at least two of them, after step S202 is executed, the photographing method in this embodiment further includes steps S205 to S206; steps S204 and S205 may be executed in either order. The details are as follows:
S205: acquiring an expected height difference between the first target object and the second target object.
The expected height difference between the first target object and the second target object can be input by the user in real time, preset and stored in the terminal, or determined by the user from at least two candidate expected height differences provided by the terminal. The expected height difference value can be specifically set according to actual needs, and is not limited here.
The expected height difference may be the height difference corresponding to the "cutest height difference" effect. A height difference of about 30 cm is popularly referred to on the internet as the "cutest height difference"; in this embodiment, the expected height difference may be 30 cm, or may be set to another value.
S206: and determining and prompting a moving direction and a moving distance value corresponding to the first target object or the second target according to a first image height value corresponding to the first target object, a second image height value corresponding to the second target object and the expected height difference value.
When the terminal acquires the current shooting height value, a first image height value corresponding to the first target object and a second image height value corresponding to the second target object can be respectively calculated according to the current shooting height value and the number of pixel points contained in the image with the same length as the current shooting height value.
Further, S206 may include S2061 to S2063.
S2061: and calculating a first shooting height value corresponding to the first camera according to the first distance value, the first visual angle value and the second visual angle value, or calculating a second shooting height value corresponding to the second camera.
Wherein the first shooting height corresponding to the first camera is d1 = L1 × cos β (the vertical component of the first line segment), and the second shooting height corresponding to the second camera is h = L2 × cos(90° − θ2/2) = L2 × cos α; alternatively h may be obtained from h = d1 − D, where D is the distance between the first camera and the second camera in the vertical direction. Substituting the values of L1, β, and L2 obtained in S204 into these expressions yields d1 and h.
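Under the same Fig. 2 assumptions, the two shooting heights are the vertical components of L1 and L2. The sketch below (hypothetical function name, illustrative only) also makes the relation h = d1 − D checkable:

```python
import math

def shooting_heights(D, theta1_deg, theta2_deg):
    """First and second shooting heights d1 and h above the support
    surface (Fig. 2 geometry); by construction h = d1 - D."""
    beta = math.radians(90 - theta1_deg / 2)
    alpha = math.radians(90 - theta2_deg / 2)
    gamma = math.radians((theta1_deg - theta2_deg) / 2)
    L1 = D * math.cos(math.radians(theta2_deg / 2)) / math.sin(gamma)
    L2 = D * math.sin(beta) / math.sin(gamma)
    d1 = L1 * math.cos(beta)   # vertical component of the first line segment
    h = L2 * math.cos(alpha)   # vertical component of the second line segment
    return d1, h
```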
S2062: and calculating a first image height value corresponding to the first target object and a second image height value corresponding to the second target object according to the first shooting height value or the second shooting height value.
When the terminal acquires the first shooting height value corresponding to the first camera, the terminal can respectively calculate a first image height value corresponding to the first target object and a second image height value corresponding to the second target object according to the first shooting height value and the number of pixel points contained in the image with the same length as the current shooting height value.
Or when the terminal acquires the second shooting height value corresponding to the second camera, the terminal may calculate a first image height value corresponding to the first target object and a second image height value corresponding to the second target object according to the second shooting height value and the number of pixel points included in the image having the same length as the current shooting height value.
In the following, an example is described in which the preview image contains two target objects and the image height of a target object is measured by the second camera of the terminal; it can be understood that the method of measuring the image height with the first camera is the same. In other embodiments, when the height differences among more than two target objects are all required to equal the expected height difference, the implementation for two target objects may be applied pairwise.
Specifically, referring to fig. 4, fig. 4 is a schematic diagram of a shooting scene according to another embodiment of the present invention. The shooting scene shown in fig. 4 includes the second camera 12 of the terminal, the first target object 20, and the second target object 30. The visible angle of the camera of the terminal 10 is θ2, the height between the second camera 12 and the ground is h, and the horizontal distance between the terminal 10 and the target object 20 is S.
From fig. 4, the first image height corresponding to the first target object is H1 = Hab + Hbc, where Hab is the same as the current shooting height value h, i.e., Hab = h.
The terminal acquires the first total number N1 of pixels contained in Hab and the second total number N2 of pixels contained in Hbc, and considers the ratio of Hbc to Hbd, where Hbd = h. Since the ratio of the second total number N2 to the first total number N1 is equal to the ratio of Hbc to Hbd, the terminal can calculate Hbc according to formula (1): Hbc = (N2 / N1) × h. Substituting formula (1) and Hab = h into H1 = Hab + Hbc gives formula (2): H1 = h + (N2 / N1) × h.
The terminal can thus obtain the first image height value corresponding to the first target object in the preview image according to formula (2).
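Formulas (1) and (2) amount to scaling the known camera height h by a pixel ratio. A minimal sketch (the sample numbers are hypothetical):

```python
def first_image_height(h, n1, n2):
    """H1 from formula (2): H1 = Hab + Hbc, with Hab = h and
    Hbc = (N2 / N1) * h from formula (1)."""
    h_bc = (n2 / n1) * h   # formula (1)
    return h + h_bc        # formula (2)

# Camera 1.5 m above the ground; Hab spans 600 pixels, Hbc spans 80.
H1 = first_image_height(h=1.5, n1=600, n2=80)  # 1.7 m
```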
S2063: and determining and prompting a moving direction and a moving distance value corresponding to the first target object or the second target according to a first image height value corresponding to the first target object, a second image height value corresponding to the second target object and the expected height difference value.
The terminal can calculate the second image height value H2 corresponding to the second target object by the same method. According to the expected height difference value, the total number of pixels corresponding to the expected height difference value, and the distance corresponding to one pixel, the terminal determines the total number of pixels that must be traversed for the difference between the first image height value and the second image height value to reach the expected height difference value. It then determines the final position of the first target object from the pixel corresponding to the top of the first image height value and that total number of pixels, or determines the final position of the second target object from the pixel corresponding to the top of the second image height value and that total number of pixels.
It can be understood that the terminal may determine a final position to be reached by the second target object after moving according to the initial position of the first target object, according to the position of the first target object being kept unchanged; or determining the final position to be reached after the first target object moves according to the initial position of the second target object while keeping the position of the second target object unchanged; the final position to be reached after the first target object moves and the final position to be reached after the second target object moves can be determined according to the initial position of the first target object and the initial position of the second target object, and both the first target object and the second target object need to move at the moment.
And then, the terminal determines the distance and the moving direction of the first target object to be moved according to the starting position a corresponding to the first target object and the final position of the first target object, or determines the distance and the moving direction of the second target object to be moved according to the starting position a' corresponding to the second target object and the final position of the second target object.
The terminal prompts the moving direction and the moving distance of the first target object to a photographing user so that the photographing user guides the first target object to move to a final position, which is determined by the terminal and needs to be reached, of the first target object; or prompting the moving direction and the moving distance of the second target object to the photographing user so that the photographing user guides the second target object to move to the final position, which is determined by the terminal and needs to be reached, of the second target object, and therefore the height difference between the first target object and the second target object in the photographed photo can be the expected height difference value.
In other embodiments, the terminal may prompt the user to perform physical measurement on an image of a target object in the preview image by using a length measurement tool, so as to obtain a first image height value corresponding to a first target object and a second image height value corresponding to a second target object, and then receive the first image height value and the second image height value input by the user through human-computer interaction. And then, the terminal determines and prompts the distance and the moving direction of the first target object or the second target object which need to move according to the method.
When the photographing method does not include S205 to S206, the terminal may perform S207 after performing S204; when the photographing method includes S205 to S206, the terminal executes S207 after executing S204 and S206.
S207: acquiring a shooting parameter corresponding to the target distance value according to a preset corresponding relation between the shooting distance value and the shooting parameter; wherein the shooting parameters at least comprise a focal length.
The preset correspondence between the shooting parameters and shooting distance values is stored in the terminal in advance; when the terminal obtains the target shooting distance value S between the target object and the terminal, the shooting parameters corresponding to the target shooting distance can be determined according to this preset correspondence.
The shooting parameters may include, but are not limited to, a focal length, which is a focal length at the time of imaging, and may further include an aperture value, a shutter speed, sensitivity, an exposure amount, and the like. When the distances between the target object and the camera are different, the distance between the lens and the image sensor (imaging surface) needs to be adjusted for clear imaging, and the adjusting process is a focusing process; that is, when the target object is at a different distance from the camera, the imaging focal length (the distance between the lens and the sensor) is different.
The aperture is generally expressed with an f-number, such as f/1.4 or f/8. The aperture value adjusts the amount of light entering the camera: the smaller the f-number, the larger the aperture and the more light passes through. Aperture size and f-number are inversely related, so f/1.4 is a large aperture and f/8 is a small aperture.
The faster the shutter speed, the shorter the exposure. The sensitivity indicates how responsive the image sensor of the camera is to light. Sensitivity is expressed as ISO followed by a number: the higher the number, the higher the sensitivity and the more sensitive the image sensor is to light.
Further, S207 may include S2071: and acquiring a focal length and an aperture value corresponding to the target distance value according to the preset corresponding relation between the shooting distance value and the shooting parameters.
The terminal stores a preset corresponding relation between a shooting distance value and a shooting parameter in advance, the shooting parameter comprises a focal length and an aperture value, and the terminal acquires the focal length and the aperture value corresponding to the target shooting distance value when determining the target shooting distance value between a target object and the terminal.
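The preset correspondence of S207/S2071 can be held as a simple lookup table. The distance brackets and parameter values below are purely hypothetical placeholders for a table that would be calibrated per device:

```python
# Hypothetical preset correspondence: shooting-distance brackets (m)
# -> (focal length mm, aperture f-number).
PARAMS_BY_DISTANCE = [
    (1.0, 3.0, 1.8),    # up to 1 m
    (3.0, 3.5, 2.2),    # up to 3 m
    (10.0, 4.2, 2.8),   # up to 10 m
]

def shooting_params(distance_m):
    """Focal length and aperture for the target shooting distance."""
    for max_d, focal, aperture in PARAMS_BY_DISTANCE:
        if distance_m <= max_d:
            return focal, aperture
    return PARAMS_BY_DISTANCE[-1][1:]   # beyond the table: farthest bracket
```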
Optionally, the photographing method may further include S208: and acquiring the illumination value of the current ambient light, and determining an exposure value corresponding to the illumination value.
The terminal stores a preset corresponding relation between the illumination value and the exposure value of the ambient light in advance, and when the terminal acquires the current illumination value of the ambient light, the exposure value corresponding to the illumination value can be determined according to the preset corresponding relation between the illumination value and the exposure value of the ambient light.
S208 has no fixed execution order relative to S202 to S207 and may be executed while S202 to S207 are being executed.
S209: and if the shooting instruction is detected, shooting a picture according to the shooting parameters.
And when the terminal detects a shooting instruction, shooting an image captured at the current visual angle according to shooting parameters corresponding to the target shooting distance, and forming a picture.
The shooting instruction may be triggered by the user through a shooting key, or may be triggered automatically by the terminal according to a timing shooting time, which is not limited herein.
When the terminal performs S2071 and S208, S209 may include: and if the shooting instruction is detected, shooting a picture according to the exposure value corresponding to the illumination value, the focal length corresponding to the target distance value and the aperture value.
Further, S209 may include S2091 to S2092.
S2091: and if the shooting instruction is detected, acquiring the movement speed of the target object.
When the terminal detects a shooting instruction, it acquires the motion speed of the focused target object. When the movement speed of any target object is greater than zero, S2092 is executed; when the movement speed of the target object is equal to zero, the photograph is taken according to the shooting parameters determined in S207.
S2092: and if the movement speed is greater than zero, determining the target position of the target object according to the movement speed, adjusting the shooting parameters according to the target position, and shooting according to the adjusted shooting parameters.
When the terminal detects that the movement speed of the target object is greater than zero, it records a first position, a second position in the movement process, and the movement time from the first position to the second position. It calculates the movement distance from the two positions, calculates the average speed of the target object from the first position to the second position from the movement distance and the movement time, and predicts the target position of the target object at the next moment from the average speed. The next moment is a moment after the target object reaches the second position, for example 0.1 seconds or 0.5 seconds later.
The terminal determines the current shooting distance according to the target position of the target object, acquires shooting parameters corresponding to the current shooting distance, adjusts the shooting parameters obtained in the step S207 according to the shooting parameters corresponding to the current shooting distance, and shoots according to the adjusted shooting parameters.
Specifically, the terminal may calculate a ratio of the total number of pixels included in the first shooting distance to the total number of pixels included in the line segment between the initial position and the target position, calculate a distance between the initial position and the target position according to the ratio and the first shooting distance corresponding to the initial position, and add the distance and the first shooting distance corresponding to the initial position to obtain a second shooting distance (i.e., a current shooting distance) corresponding to the target position.
The method for calculating the distance between the initial position and the target position according to the ratio of the number of the pixel points may refer to the above method for calculating the image height according to the number of the pixel points, which is not repeated herein.
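A one-dimensional sketch of the prediction and distance update in S2092. The helper names and pixel counts are illustrative, and the sketch assumes the target moves away from the camera:

```python
def predict_position(p1, p2, dt, lookahead):
    """Extrapolate the target position `lookahead` seconds past the
    second recorded position, using the average speed (p2 - p1) / dt."""
    return p2 + (p2 - p1) / dt * lookahead

def second_shooting_distance(s1, n_s1_pixels, n_moved_pixels):
    """Current shooting distance at the predicted position: the moved
    distance is recovered from the pixel ratio against the first
    shooting distance s1, then added to s1."""
    return s1 + s1 * (n_moved_pixels / n_s1_pixels)
```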
The terminal can determine shooting parameters which should be adopted when the terminal takes a picture according to the current shooting distance, wherein the shooting parameters which should be adopted when the terminal takes the picture include but are not limited to a focal length, and further include an aperture value which should be adopted, a photosensitive value which should be adopted, a shutter time and the like.
The terminal adjusts the shooting parameters determined in S207 to the shooting parameters determined as described above to be employed.
Further, when the terminal executes S2071 and S208, S2092 may specifically be: if the movement speed is larger than zero, determining the target position of the target object according to the movement speed, adjusting the focal length and the aperture value corresponding to the target distance value according to the target position to shoot the picture, and shooting the picture according to the exposure value corresponding to the current illumination value, the adjusted focal length and the adjusted aperture value.
Optionally, after S204, the photographing method of the present embodiment further includes S210 to S213. Specifically, the method comprises the following steps:
s210: and determining the size information of the target shooting range corresponding to the target shooting distance according to the preset corresponding relation among the focal length of the lens, the shooting distance and the size information of the shooting range.
The terminal stores in advance the preset correspondence between the lens focal length of the camera, the shooting distance, and the size information of the shooting range (or the capture range of the lens). According to the lens focal length of the built-in camera and the target shooting distance between the terminal and the target object, it determines the size information of the target shooting range corresponding to that focal length at the target shooting distance, obtaining the length value and width value of the target shooting range. Table 2 shows a preset correspondence between the lens focal length of the camera and the size information of the shooting range.
Table two:
[Table 2 is reproduced as an image in the original: for each lens focal length and shooting distance, it lists the corresponding length and width of the shooting range.]
for example, when the focal length of the lens of the camera is 2.8 millimeters (mm) and the shooting distance is 5 meters (m), the target shooting range is an area corresponding to 13 meters long and 9.8 meters wide.
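The Table 2 lookup can be sketched as a dictionary. Only the (2.8 mm, 5 m) → 13 m × 9.8 m entry comes from the text; the second row is a made-up placeholder:

```python
# Preset correspondence (lens focal length mm, shooting distance m)
# -> (shooting-range length m, shooting-range width m).
RANGE_TABLE = {
    (2.8, 5.0): (13.0, 9.8),    # example given in the text
    (3.6, 5.0): (10.0, 7.5),    # hypothetical placeholder entry
}

def target_shooting_range(focal_mm, distance_m):
    """Size of the target shooting range for a focal length/distance pair."""
    return RANGE_TABLE[(focal_mm, distance_m)]
```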
Wherein S210 and S207 are parallel steps and are not executed in a fixed order.
S211: and determining an image height value corresponding to the target object according to the target visual angle value and the shooting height value.
The method for calculating the image height value corresponding to the target object may refer to the related description in S206, which is not repeated herein.
S212: and determining an image width value corresponding to the target object according to the size information of the target shooting range and the image height value corresponding to the target object.
Because the first ratio, of the image height value of the target object to the width value corresponding to the target shooting range, is equal to the second ratio, of the image width value of the target object to the length value corresponding to the target shooting range, once the terminal has calculated the image height value of the target object by the method in S206 it can calculate the image width value of the target object from the length value corresponding to the target shooting range and the ratio of the image height value to the width value corresponding to the target shooting range. The image width value refers to the width of the target object in the preview image when the target object directly faces the camera.
S213: and determining and outputting a trimming parameter corresponding to the target object in the picture according to the size information of the target shooting range, the image height value corresponding to the target object and the image width value corresponding to the target object.
For example, the terminal may calculate the length ratio between the length value corresponding to the target shooting range and the image height value corresponding to the target object, and obtain the width ratio corresponding to that length ratio from a preset correspondence between length ratios and width ratios. It then calculates the target image width value for the target object from the obtained width ratio and the width value corresponding to the target shooting range, and outputs a trimming parameter for adjusting the image width of the target object according to the difference between the target image width value and the image width value, so that when the terminal obtains the picture it can trim it according to the trimming parameter to beautify the picture.
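The trimming computation of S213 might look like the following sketch. The preferred-ratio mapping, the fallback rule, and all sample values are hypothetical; they only illustrate the ratio arithmetic described above:

```python
# Hypothetical preset mapping from length ratio to preferred width ratio.
PREFERRED_WIDTH_RATIO = {4.0: 10.0, 8.0: 20.0}

def trimming_parameter(range_len, range_wid, image_h, image_w):
    """Width adjustment needed so that the target object's image width
    matches the preferred proportions; positive means widen."""
    length_ratio = range_len / image_h
    width_ratio = PREFERRED_WIDTH_RATIO.get(length_ratio, 2.5 * length_ratio)
    target_w = range_wid / width_ratio
    return target_w - image_w
```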
According to the scheme, the terminal determines a target shooting distance value between a target object and the terminal through the visual angle values corresponding to the first camera and the second camera and the distance value between the first camera and the second camera, determines shooting parameters corresponding to the target shooting distance value, wherein the shooting parameters comprise focal lengths, and shoots according to the determined shooting parameters. The terminal can accurately adjust the focal length of the camera according to the target shooting distance value, clear images of the target object are obtained through shooting, and the imaging quality of the pictures is improved.
The terminal determines the exposure value corresponding to the illumination value of the current ambient light and takes the photograph according to that exposure value together with the focal length and aperture value corresponding to the target shooting distance, without requiring the user to set the exposure value, focal length, aperture, or other shooting parameters manually. This prevents the photograph from being unclear because manually adjusted shooting parameters do not match the current shooting scene, allows the exposure value, focal length, and aperture value corresponding to the target shooting distance to be set accurately, improves the imaging definition of the target object, and further improves the imaging quality of the picture.
The terminal can output the trimming parameters according to the image width value and image height value of the target object, so that the shot picture can be beautified according to the trimming parameters to adjust the figure proportion in the picture.
The terminal can predict the shooting distance of the moving target object in the moving process, so that the moving target object is focused, and the imaging quality of the moving target object is improved.
The terminal can determine and prompt the moving direction and moving distance value corresponding to the first target object or the second target object according to the expected height difference value and the image height values corresponding to any two target objects, so as to achieve the "cutest height difference" effect among the target objects in the photo.
The embodiment of the invention also provides a terminal, which is used for executing the unit of the photographing method in any one of the preceding items. Specifically, referring to fig. 5, a schematic block diagram of a terminal according to an embodiment of the present invention is shown. The terminal 5 of the present embodiment includes: an image acquisition unit 501, a first acquisition unit 502, a first determination unit 503, a second acquisition unit 504, and a photographing unit 505.
An image acquisition unit 501, configured to acquire a preview image through a first camera and a second camera; wherein the preview image contains a target object to be photographed;
a first obtaining unit 502, configured to obtain a first distance value between the first camera and the second camera in a vertical direction, obtain a first visual angle value corresponding to the first camera, and obtain a second visual angle value corresponding to the second camera;
a first determining unit 503, configured to determine a target shooting distance value between the target object and the terminal according to the first distance value, the first visual angle value, and the second visual angle value;
a second obtaining unit 504, configured to obtain a shooting parameter corresponding to the target distance value according to a preset corresponding relationship between the shooting distance value and the shooting parameter; wherein the shooting parameters at least comprise a focal length;
and a photographing unit 505, configured to take a picture according to the photographing parameters if the photographing instruction is detected.
Further, the photographing unit 505 specifically includes:
a detection unit 5051 configured to acquire a movement speed of the target object if the photographing instruction is detected;
the photographing unit 5052 is configured to determine a target position of the target object according to the motion speed if the motion speed is greater than zero, adjust the photographing parameter according to the target position, and photograph according to the adjusted photographing parameter.
Optionally, the first determining unit 503 includes:
a first calculating unit 5031, configured to calculate a length of a first line segment and a first included angle value according to the first distance value, the first viewing angle value, and the second viewing angle value; a first end point of the first line segment is a first position of the first camera or a second position of the second camera, and a second end point of the first line segment is an intersection point of the target object and the supporting surface; if the first end point of the first line segment is the first position of the first camera, the first included angle is an included angle formed by the first line segment and the first camera in the gravity direction; if the first end point of the first line segment is the second position of the second camera, the first included angle is an included angle formed by the first line segment and the second camera in the gravity direction;
a second calculating unit 5032, configured to calculate a target shooting distance value between the target object and the terminal according to the length of the first line segment and the first included angle value.
Optionally, the terminal further includes:
a shooting range determining unit 506, configured to determine, when the first determining unit 503 determines the target shooting distance, size information of a target shooting range corresponding to the target shooting distance according to a preset correspondence relationship between a lens focal length, the shooting distance, and the size information of the shooting range;
an image height determining unit 507, configured to determine an image height value corresponding to the target object according to the target viewing angle value and the shooting height value acquired by the first acquiring unit 502;
an image width determining unit 508, configured to determine an image width value corresponding to the target object according to the size information of the target shooting range and the image height value corresponding to the target object;
a trimming parameter determining unit 509, configured to determine and output a trimming parameter corresponding to the target object in the photo captured by the capturing unit 5052 according to the size information of the target capturing range, the image height value corresponding to the target object, and the image width value corresponding to the target object.
Optionally, the number of the target objects is at least two, and the terminal may further include:
a third obtaining unit 510, configured to obtain a desired height difference value between the first target object and the second target object;
the second determining unit 511 is configured to determine and prompt a moving direction and a moving distance value corresponding to the first target object or the second target object according to the first image height value corresponding to the first target object, the second image height value corresponding to the second target object, and the expected height difference value; after prompting the moving direction and moving distance value, the second determining unit 511 sends notification information to the photographing unit 505 to notify the photographing unit 505 to detect a shooting instruction.
Further, the second determining unit 511 may include:
a shooting height calculating unit 5111, configured to calculate a first shooting height value corresponding to the first camera according to the first distance value, the first visual angle value, and the second visual angle value, or calculate a second shooting height value corresponding to the second camera;
an image height calculating unit 5112, configured to calculate a first image height value corresponding to the first target object and a second image height value corresponding to the second target object according to the first shooting height value or the second shooting height value;
a reminding unit 5113, configured to determine and prompt a moving direction and a moving distance value corresponding to the first target object or the second target object according to the first image height value corresponding to the first target object, the second image height value corresponding to the second target object, and the expected height difference value.
Specifically, the terminal may further include:
a fourth obtaining unit 512, configured to obtain an illumination value of the current ambient light, and determine an exposure value corresponding to the illumination value;
the second obtaining unit 504 is specifically configured to: acquiring a focal length and an aperture value corresponding to the target distance value according to a preset corresponding relation between the shooting distance value and the shooting parameter;
the photographing unit 505 is specifically configured to: if the shooting instruction is detected, shoot a picture according to the exposure value corresponding to the illumination value and the focal length and aperture value corresponding to the target distance value.
According to the scheme, the terminal determines the target shooting distance value between the target object and the terminal according to the current shooting height value and the visual angle value of the camera, determines the shooting parameters corresponding to the target shooting distance value, wherein the shooting parameters comprise the focal length, and shoots according to the determined shooting parameters. The terminal can accurately adjust the focal length of the camera according to the target shooting distance value, clear images of the target object are obtained through shooting, and the imaging quality of the pictures is improved.
The terminal determines an exposure value corresponding to the illumination value of the current ambient light, and shoots according to that exposure value and the focal length and aperture value corresponding to the target shooting distance, without requiring the user to manually set shooting parameters such as the exposure value, the focal length, and the aperture. This prevents the situation in which the captured photo is unclear because manually adjusted shooting parameters do not match the current shooting scene; the exposure value, the focal length, and the aperture value corresponding to the target shooting distance can be set accurately, which improves the imaging definition of the target object and further improves the imaging quality of the photo.
The terminal can output the retouching parameters according to the imaging width value and the imaging height value of the target object, so that the shot picture can be beautified according to the retouching parameters to adjust the figure proportion in the picture.
The terminal can predict the shooting distance of the moving target object in the moving process, so that the moving target object is focused, and the imaging quality of the moving target object is improved.
The terminal can determine and prompt the moving direction and the moving distance value corresponding to the first target object or the second target object according to the expected height difference value and the image height values corresponding to any two target objects, so that the expected height difference between the target objects is achieved in the photo.
Referring to fig. 6, a schematic block diagram of a terminal according to another embodiment of the present invention is shown. The terminal 6 in the present embodiment as shown in the figure may include: one or more processors 601; one or more input devices 602, one or more output devices 603, and memory 604. The processor 601, the input device 602, the output device 603, and the memory 604 are connected by a bus 605. The memory 604 is used to store computer programs comprising program instructions, and the processor 601 is used to execute the program instructions stored by the memory 604. Wherein the processor 601 is configured to call the program instruction to perform:
acquiring a preview image through a first camera and a second camera; wherein the preview image contains a target object to be photographed;
acquiring a first distance value between the first camera and the second camera in the vertical direction, acquiring a first visual angle value corresponding to the first camera and acquiring a second visual angle value corresponding to the second camera;
determining a target shooting distance value between the target object and the terminal according to the first distance value, the first visual angle value and the second visual angle value;
acquiring a shooting parameter corresponding to the target distance value according to a preset corresponding relation between the shooting distance value and the shooting parameter; wherein the shooting parameters at least comprise a focal length;
and if the shooting instruction is detected, shooting a picture according to the shooting parameters.
Optionally, the processor 601 is specifically configured to invoke the program instructions to perform:
calculating the length of a first line segment and a first included angle value according to the first distance value, the first visual angle value and the second visual angle value; a first end point of the first line segment is a first position of the first camera or a second position of the second camera, and a second end point of the first line segment is an intersection point of the target object and the supporting surface; if the first end point of the first line segment is the first position of the first camera, the first included angle is an included angle formed by the first line segment and the first camera in the gravity direction; if the first end point of the first line segment is the second position of the second camera, the first included angle is an included angle formed by the first line segment and the second camera in the gravity direction;
and calculating a target shooting distance value between the target object and the terminal according to the length of the first line segment and the first included angle value.
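The triangulation in the two steps above can be sketched in a few lines of Python. This is a minimal illustration, not the patent's implementation: it assumes the two cameras are stacked vertically with baseline d, and that theta1/theta2 are the angles (measured from the gravity direction) at which the upper and lower cameras see the point where the target object meets the supporting surface.

```python
import math

def target_distance(d, theta1, theta2):
    """Estimate the horizontal shooting distance from a vertical dual-camera rig.

    Hypothetical model: the two cameras are separated vertically by d (metres);
    theta1/theta2 are the angles (radians, from the gravity direction) at which
    the upper and lower cameras see the foot point of the target object.
    """
    if math.isclose(math.tan(theta1), math.tan(theta2)):
        raise ValueError("angles must differ for triangulation")
    # Upper camera height h1, lower camera height h2 = h1 - d, with
    # tan(theta) = x / h for each camera, so:
    #   x * (1/tan(theta1) - 1/tan(theta2)) = d
    x = d / (1.0 / math.tan(theta1) - 1.0 / math.tan(theta2))
    # The shooting height and the length of the "first line segment"
    # (camera -> foot point) follow from the same right triangle:
    h1 = x / math.tan(theta1)
    segment = math.hypot(x, h1)
    return x, h1, segment
```

With a 0.2 m baseline and a foot point 2 m away seen from heights of 1.5 m and 1.3 m, the sketch recovers the 2 m shooting distance, the 1.5 m shooting height, and a 2.5 m first line segment.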
Optionally, after determining the target shooting distance value between the target object and the terminal according to the shooting height value and the target visual angle value, the processor 601 is further configured to invoke the program instructions to perform:
determining the size information of a target shooting range corresponding to the target shooting distance according to the preset corresponding relation among the focal length of the lens, the shooting distance and the size information of the shooting range;
determining an image height value corresponding to the target object according to the target visual angle value and the shooting height value;
determining an image width value corresponding to the target object according to the size information of the target shooting range and the image height value corresponding to the target object;
and determining and outputting a trimming parameter corresponding to the target object in the picture according to the size information of the target shooting range, the image height value corresponding to the target object and the image width value corresponding to the target object.
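As a concrete (hypothetical) reading of the four steps above, the sketch below maps the real-world size covered by the frame at the target shooting distance into pixels, derives the image height and width of the object, and emits a single retouching scale. All parameter names, the object aspect ratio, and the proportion model are illustrative assumptions, not values from the patent.

```python
def image_size_and_trim(range_w_m, range_h_m, sensor_w_px, sensor_h_px,
                        object_h_m, object_aspect=0.25, desired_ratio=0.45):
    """Hypothetical sketch: map a real-world object height into pixels and
    derive a simple retouching (body-proportion) scale factor.

    range_w_m / range_h_m: real-world size covered by the frame at the target
    shooting distance (the "size information of the shooting range").
    object_aspect: assumed real width/height ratio of the object.
    desired_ratio: assumed target proportion of the figure in the frame.
    """
    px_per_m_h = sensor_h_px / range_h_m
    px_per_m_w = sensor_w_px / range_w_m
    img_h_px = object_h_m * px_per_m_h                   # image height value
    img_w_px = object_h_m * object_aspect * px_per_m_w   # image width value
    # Trimming parameter: how much to scale the figure so its proportion
    # in the photo reaches the desired ratio.
    current_ratio = img_h_px / sensor_h_px
    trim_scale = desired_ratio / current_ratio if current_ratio > 0 else 1.0
    return img_h_px, img_w_px, trim_scale
```

For a 4 m x 3 m shooting range imaged on a 4000 x 3000 sensor, a 1.5 m object occupies 1500 px in height and, with the assumed aspect ratio, 375 px in width; it fills half the frame height, so the retouching scale is 0.9.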
Optionally, the number of the target objects is at least two; if the shooting instruction is detected, before the photo is shot according to the shooting parameters, the processor 601 is further configured to call the program instruction to perform:
obtaining an expected height difference between the first target object and the second target object;
and determining and prompting a moving direction and a moving distance value corresponding to the first target object or the second target object according to a first image height value corresponding to the first target object, a second image height value corresponding to the second target object and the expected height difference value.
Optionally, the processor 601 is specifically configured to invoke the program instructions to perform:
calculating a first shooting height value corresponding to the first camera according to the first distance value, the first visual angle value and the second visual angle value, or calculating a second shooting height value corresponding to the second camera;
calculating a first image height value corresponding to the first target object and a second image height value corresponding to the second target object according to the first shooting height value or the second shooting height value;
and determining and prompting a moving direction and a moving distance value corresponding to the first target object or the second target object according to a first image height value corresponding to the first target object, a second image height value corresponding to the second target object and the expected height difference value.
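Under a simple pinhole assumption (image height scales inversely with subject distance), the prompt in the last step can be sketched as follows. This is an illustrative model only: the function name, the single-distance input, and the pinhole approximation are assumptions, not the patent's method.

```python
def movement_prompt(h1_img, h2_img, d1, desired_diff):
    """Hypothetical pinhole-model sketch: how far should the first target
    object move so that its image height exceeds the second target object's
    image height by desired_diff (heights in pixels, d1 in metres)?

    Pinhole assumption: moving the object from d1 to d1' changes its image
    height to h1_img * d1 / d1'.
    """
    target_h = h2_img + desired_diff
    if target_h <= 0:
        raise ValueError("target image height must be positive")
    new_d1 = h1_img * d1 / target_h
    move = d1 - new_d1
    direction = "toward the camera" if move > 0 else "away from the camera"
    return direction, abs(move)
```

For example, an 800 px subject standing 4 m away who wants to match a 900 px subject would be prompted to step about 0.44 m toward the camera.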
Optionally, the processor 601 is further configured to invoke the program instructions to perform:
if a shooting instruction is detected, acquiring the movement speed of the target object;
and if the movement speed is greater than zero, determining the target position of the target object according to the movement speed, adjusting the shooting parameters according to the target position, and shooting according to the adjusted shooting parameters.
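A minimal sketch of this moving-target step, with an assumed sorted distance-to-focal-length table and the convention (also an assumption) that a positive speed means the object is receding:

```python
def refocus_for_motion(distance_m, speed_mps, shutter_lag_s, focus_table):
    """Predict where a moving target will be after the shutter lag and pick
    focus parameters for the predicted distance.

    focus_table: assumed sorted list of (distance_m, focal_length_mm) pairs;
    the entry with the largest distance not exceeding the predicted distance
    is chosen (the first entry is the fallback for very close targets).
    """
    predicted = distance_m + speed_mps * shutter_lag_s
    chosen = focus_table[0][1]
    for dist, focal in focus_table:
        if dist <= predicted:
            chosen = focal
        else:
            break
    return predicted, chosen
```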
Optionally, the processor 601 is configured to invoke the program instructions to perform:
acquiring an illumination value of current ambient light, and determining an exposure value corresponding to the illumination value;
acquiring a focal length and an aperture value corresponding to the target distance value according to a preset corresponding relation between the shooting distance value and the shooting parameter;
and if the shooting instruction is detected, shooting a picture according to the exposure value corresponding to the illumination value, the focal length corresponding to the target distance value and the aperture value.
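The final lookup-and-shoot step can be sketched with two assumed correspondence tables; the table names and numeric values below are illustrative placeholders, not values from the patent.

```python
import bisect

# Assumed preset correspondence tables (illustrative values only):
LUX_TO_EV = [(50, 5), (200, 7), (800, 9), (3200, 11)]                     # lux -> exposure value
DIST_TO_PARAMS = [(0.5, (28, 1.8)), (2.0, (35, 2.2)), (5.0, (50, 2.8))]   # m -> (focal mm, f-number)

def pick_shot_settings(lux, distance_m):
    """Choose the exposure value from the ambient illumination and the
    (focal length, aperture) pair from the target shooting distance, using
    the first table entry whose key is >= the query (clamped to the last)."""
    def lookup(table, key):
        keys = [k for k, _ in table]
        i = min(bisect.bisect_left(keys, key), len(table) - 1)
        return table[i][1]
    ev = lookup(LUX_TO_EV, lux)
    focal, aperture = lookup(DIST_TO_PARAMS, distance_m)
    return ev, focal, aperture
```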
It should be understood that, in the embodiment of the present invention, the processor 601 may be a Central Processing Unit (CPU); it may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 602 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, etc., and the output device 603 may include a display (LCD, etc.), a speaker, etc.
The memory 604 may include both read-only memory and random access memory, and provides instructions and data to the processor 601. A portion of the memory 604 may also include non-volatile random access memory. For example, the memory 604 may also store device type information.
In a specific implementation, the processor 601, the input device 602, and the output device 603 described in this embodiment of the present invention may execute the implementation manners described in the first embodiment and the second embodiment of the photographing method provided in this embodiment of the present invention, and may also execute the implementation manner of the terminal described in this embodiment of the present invention, which is not described herein again.
In another embodiment of the present invention, a computer-readable storage medium is provided, the computer-readable storage medium storing a computer program comprising program instructions that when executed by a processor implement:
acquiring a preview image through a first camera and a second camera; wherein the preview image contains a target object to be photographed;
acquiring a first distance value between the first camera and the second camera in the vertical direction, acquiring a first visual angle value corresponding to the first camera and acquiring a second visual angle value corresponding to the second camera;
determining a target shooting distance value between the target object and the terminal according to the first distance value, the first visual angle value and the second visual angle value;
acquiring a shooting parameter corresponding to the target distance value according to a preset corresponding relation between the shooting distance value and the shooting parameter; wherein the shooting parameters at least comprise a focal length;
and if the shooting instruction is detected, shooting a picture according to the shooting parameters.
Optionally, the program instructions, when executed by the processor, implement:
calculating the length of a first line segment and a first included angle value according to the first distance value, the first visual angle value and the second visual angle value; a first end point of the first line segment is a first position of the first camera or a second position of the second camera, and a second end point of the first line segment is an intersection point of the target object and the supporting surface; if the first end point of the first line segment is the first position of the first camera, the first included angle is an included angle formed by the first line segment and the first camera in the gravity direction; if the first end point of the first line segment is the second position of the second camera, the first included angle is an included angle formed by the first line segment and the second camera in the gravity direction;
and calculating a target shooting distance value between the target object and the terminal according to the length of the first line segment and the first included angle value.
Optionally, after determining the target shooting distance value between the target object and the terminal according to the shooting height value and the target visual angle value, the program instructions when executed by the processor further implement:
determining the size information of a target shooting range corresponding to the target shooting distance according to the preset corresponding relation among the focal length of the lens, the shooting distance and the size information of the shooting range;
determining an image height value corresponding to the target object according to the target visual angle value and the shooting height value;
determining an image width value corresponding to the target object according to the size information of the target shooting range and the image height value corresponding to the target object;
and determining and outputting a trimming parameter corresponding to the target object in the picture according to the size information of the target shooting range, the image height value corresponding to the target object and the image width value corresponding to the target object.
Optionally, the number of the target objects is at least two; if the shooting instruction is detected, before the picture is shot according to the shooting parameters, the program instructions are executed by the processor to further realize:
obtaining an expected height difference between the first target object and the second target object;
and determining and prompting a moving direction and a moving distance value corresponding to the first target object or the second target object according to a first image height value corresponding to the first target object, a second image height value corresponding to the second target object and the expected height difference value.
Optionally, the program instructions, when executed by the processor, implement:
calculating a first shooting height value corresponding to the first camera according to the first distance value, the first visual angle value and the second visual angle value, or calculating a second shooting height value corresponding to the second camera;
calculating a first image height value corresponding to the first target object and a second image height value corresponding to the second target object according to the first shooting height value or the second shooting height value;
and determining and prompting a moving direction and a moving distance value corresponding to the first target object or the second target object according to a first image height value corresponding to the first target object, a second image height value corresponding to the second target object and the expected height difference value.
Optionally, the program instructions, when executed by the processor, implement:
if a shooting instruction is detected, acquiring the movement speed of the target object;
and if the movement speed is greater than zero, determining the target position of the target object according to the movement speed, adjusting the shooting parameters according to the target position, and shooting according to the adjusted shooting parameters.
Optionally, the program instructions when executed by the processor implement:
acquiring an illumination value of current ambient light, and determining an exposure value corresponding to the illumination value;
acquiring a focal length and an aperture value corresponding to the target distance value according to a preset corresponding relation between the shooting distance value and the shooting parameter;
and if the shooting instruction is detected, shooting a picture according to the exposure value corresponding to the illumination value, the focal length corresponding to the target distance value and the aperture value.
The computer readable storage medium may be an internal storage unit of the terminal according to any of the foregoing embodiments, for example, a hard disk or a memory of the terminal. The computer readable storage medium may also be an external storage device of the terminal, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the terminal. Further, the computer-readable storage medium may also include both an internal storage unit and an external storage device of the terminal. The computer-readable storage medium is used for storing the computer program and other programs and data required by the terminal. The computer readable storage medium may also be used to temporarily store data that has been output or is to be output.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of the two. To illustrate clearly the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functions. Whether such functions are implemented in hardware or software depends on the particular application and the design constraints of the implementation. Skilled artisans may implement the described functions in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the terminal and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal and method can be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (7)

1. A shooting method based on double cameras is characterized by comprising the following steps:
acquiring a preview image through a first camera and a second camera; wherein the preview image contains a target object to be photographed; the target objects at least comprise a first target object and a second target object;
acquiring a first distance value between the first camera and the second camera in the vertical direction, acquiring a first visual angle value corresponding to the first camera and acquiring a second visual angle value corresponding to the second camera;
calculating the length of a first line segment and a first included angle value according to the first distance value, the first visual angle value and the second visual angle value; a first end point of the first line segment is a first position of the first camera or a second position of the second camera, and a second end point of the first line segment is an intersection point of any one target object and the supporting surface; if the first end point of the first line segment is the first position of the first camera, the first included angle is an included angle formed by the first line segment and the first camera in the gravity direction; if the first end point of the first line segment is the second position of the second camera, the first included angle is an included angle formed by the first line segment and the second camera in the gravity direction;
calculating a target shooting distance value between any one target object and a terminal according to the length of the first line segment and the first included angle value;
obtaining an expected height difference between the first target object and the second target object;
calculating a first shooting height value corresponding to the first camera according to the first distance value, the first visual angle value and the second visual angle value, or calculating a second shooting height value corresponding to the second camera;
calculating a first image height value corresponding to the first target object and a second image height value corresponding to the second target object according to the first shooting height value or the second shooting height value;
determining and prompting a moving direction and a moving distance value corresponding to the first target object or the second target object according to a first image height value corresponding to the first target object, a second image height value corresponding to the second target object and the expected height difference value;
acquiring a shooting parameter corresponding to the target distance value according to a preset corresponding relation between the shooting distance value and the shooting parameter; wherein the shooting parameters at least comprise a focal length;
and if the shooting instruction is detected, shooting a picture according to the shooting parameters.
2. The photographing method according to claim 1, wherein after calculating the target photographing distance value between any one of the target objects and the terminal according to the length of the first line segment and the first included angle value, the photographing method further comprises:
determining the size information of a target shooting range corresponding to the target shooting distance according to the preset corresponding relation among the focal length of the lens, the shooting distance and the size information of the shooting range;
determining an image height value corresponding to the target object according to the target visual angle value and the shooting height value;
determining an image width value corresponding to the target object according to the size information of the target shooting range and the image height value corresponding to the target object;
and determining and outputting a trimming parameter corresponding to the target object in the picture according to the size information of the target shooting range, the image height value corresponding to the target object and the image width value corresponding to the target object.
3. The photographing method of claim 1, wherein if the photographing instruction is detected, the photographing according to the photographing parameters comprises:
if a shooting instruction is detected, acquiring the movement speed of the target object;
and if the movement speed is greater than zero, determining the target position of the target object according to the movement speed, adjusting the shooting parameters according to the target position, and shooting according to the adjusted shooting parameters.
4. The photographing method according to any one of claims 1 to 3, further comprising:
acquiring an illumination value of current ambient light, and determining an exposure value corresponding to the illumination value;
the acquiring of the shooting parameters corresponding to the target distance values according to the preset corresponding relationship between the shooting distance values and the shooting parameters includes:
acquiring a focal length and an aperture value corresponding to the target distance value according to a preset corresponding relation between the shooting distance value and the shooting parameter;
if the shooting instruction is detected, shooting the picture according to the shooting parameters comprises:
and if the shooting instruction is detected, shooting a picture according to the exposure value corresponding to the illumination value, the focal length corresponding to the target distance value and the aperture value.
5. A terminal, characterized in that it comprises means for performing the method according to any of claims 1-4.
6. A terminal, comprising a processor, an input device, an output device, and a memory, the processor, the input device, the output device, and the memory being interconnected, wherein the memory is configured to store a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1-4.
7. A computer-readable storage medium, characterized in that the computer storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to perform the method according to any of claims 1-4.
CN201711468892.8A 2017-12-28 2017-12-28 Photographing method based on double cameras, terminal and computer readable storage medium Active CN108200335B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711468892.8A CN108200335B (en) 2017-12-28 2017-12-28 Photographing method based on double cameras, terminal and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711468892.8A CN108200335B (en) 2017-12-28 2017-12-28 Photographing method based on double cameras, terminal and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108200335A CN108200335A (en) 2018-06-22
CN108200335B true CN108200335B (en) 2020-01-14

Family

ID=62585931

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711468892.8A Active CN108200335B (en) 2017-12-28 2017-12-28 Photographing method based on double cameras, terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108200335B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111182198A (en) * 2018-11-13 2020-05-19 奇酷互联网络科技(深圳)有限公司 Shooting focusing method based on double cameras, mobile device and device
CN109361868B (en) * 2018-11-27 2020-09-25 浙江舜宇光学有限公司 Focusing method, photographing device and focusing device
CN109856617A (en) * 2019-01-24 2019-06-07 珠海格力电器股份有限公司 Image pickup method, device, processor and terminal based on microwave radar
CN112135088B (en) * 2019-06-25 2024-04-16 北京京东尚科信息技术有限公司 Method for displaying trial assembly effect, trial assembly terminal and storage medium
CN113194173A (en) * 2021-04-29 2021-07-30 维沃移动通信(杭州)有限公司 Depth data determination method and device and electronic equipment
CN114754707B (en) * 2022-04-18 2024-01-30 北京半导体专用设备研究所(中国电子科技集团公司第四十五研究所) Flatness detection method and level detection table for infrared detection chip
CN117156269A (en) * 2022-07-06 2023-12-01 惠州Tcl移动通信有限公司 Camera focusing method, device, electronic equipment and computer readable storage medium
CN117729320A (en) * 2024-02-07 2024-03-19 荣耀终端有限公司 Image display method, device and storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103344213A (en) * 2013-06-28 2013-10-09 三星电子(中国)研发中心 Method and device for measuring distance of double-camera
CN105120148A (en) * 2015-08-14 2015-12-02 深圳市金立通信设备有限公司 An automatic focusing method and a terminal
CN105578024A (en) * 2015-05-27 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Camera focusing method, focusing device and mobile terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5712519B2 (en) * 2010-07-23 2015-05-07 株式会社リコー Imaging apparatus and imaging method


Also Published As

Publication number Publication date
CN108200335A (en) 2018-06-22

Similar Documents

Publication Publication Date Title
CN108200335B (en) Photographing method based on double cameras, terminal and computer readable storage medium
CN107920211A (en) Photographing method, terminal and computer-readable storage medium
EP3326360B1 (en) Image capturing apparatus and method of operating the same
US9712751B2 (en) Camera field of view effects based on device orientation and scene content
RU2629436C2 (en) Method and scale management device and digital photographic device
KR102085766B1 (en) Method and Apparatus for controlling Auto Focus of an photographing device
TWI442328B (en) Shadow and reflection identification in image capturing devices
CN108605087B (en) Terminal photographing method and device and terminal
WO2016127671A1 (en) Image filter generating method and device
KR20150026268A (en) Method for stabilizing image and an electronic device thereof
US9106829B2 (en) Apparatus and method for providing guide information about photographing subject in photographing device
CN106454086B (en) Image processing method and mobile terminal
WO2017124899A1 (en) Information processing method, apparatus and electronic device
WO2018184260A1 (en) Correcting method and device for document image
TWI546726B (en) Image processing methods and systems in accordance with depth information, and computer program prodcuts
CN111968052A (en) Image processing method, image processing apparatus, and storage medium
CN108154090B (en) Face recognition method and device
TW201513661A (en) Photography device and adjusting system and adjusting method thereof
CN113866782A (en) Image processing method and device and electronic equipment
CN107155000B (en) Photographing behavior analysis method and device and mobile terminal
CN111182208B (en) Photographing method and device, storage medium and electronic equipment
CN107578006B (en) Photo processing method and mobile terminal
JP2017103657A (en) Imaging apparatus, imaging method, and computer program for imaging apparatus
CN107360361B (en) Method and device for shooting people in backlight mode
CN111325674A (en) Image processing method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210224

Address after: 518057, 1702-1703, 17th floor (natural floor 15), Desai Science and Technology Building, 9789 Shennan Avenue, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen Microphone Holdings Co.,Ltd.

Address before: 518040 21 floor, east block, Times Technology Building, 7028 Shennan Road, Futian District, Shenzhen, Guangdong.

Patentee before: DONGGUAN GOLDEX COMMUNICATION TECHNOLOGY Co.,Ltd.